Jan 15 13:43:52.023668 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:40:50 -00 2025
Jan 15 13:43:52.023714 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8945029ddd0f3864592f8746dde99cfcba228e0d3cb946f5938103dbe8733507
Jan 15 13:43:52.023727 kernel: BIOS-provided physical RAM map:
Jan 15 13:43:52.023742 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 15 13:43:52.023752 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 15 13:43:52.023761 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 15 13:43:52.023772 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Jan 15 13:43:52.023782 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Jan 15 13:43:52.023792 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 15 13:43:52.023802 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 15 13:43:52.023812 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 15 13:43:52.023821 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 15 13:43:52.023843 kernel: NX (Execute Disable) protection: active
Jan 15 13:43:52.023854 kernel: APIC: Static calls initialized
Jan 15 13:43:52.023866 kernel: SMBIOS 2.8 present.
Jan 15 13:43:52.023882 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Jan 15 13:43:52.023893 kernel: Hypervisor detected: KVM
Jan 15 13:43:52.023909 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 15 13:43:52.023920 kernel: kvm-clock: using sched offset of 4721644765 cycles
Jan 15 13:43:52.023932 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 15 13:43:52.023943 kernel: tsc: Detected 2799.998 MHz processor
Jan 15 13:43:52.023954 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 15 13:43:52.023965 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 15 13:43:52.023976 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Jan 15 13:43:52.023987 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 15 13:43:52.023997 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 15 13:43:52.024013 kernel: Using GB pages for direct mapping
Jan 15 13:43:52.024024 kernel: ACPI: Early table checksum verification disabled
Jan 15 13:43:52.024035 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 15 13:43:52.024046 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024057 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024068 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024078 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Jan 15 13:43:52.024089 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024100 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024115 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024126 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 13:43:52.024137 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Jan 15 13:43:52.024148 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Jan 15 13:43:52.024159 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Jan 15 13:43:52.024176 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Jan 15 13:43:52.024188 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Jan 15 13:43:52.024204 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Jan 15 13:43:52.024215 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Jan 15 13:43:52.024227 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 15 13:43:52.024271 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Jan 15 13:43:52.024283 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 15 13:43:52.024295 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Jan 15 13:43:52.024306 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 15 13:43:52.024323 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Jan 15 13:43:52.024335 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 15 13:43:52.024346 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Jan 15 13:43:52.024357 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 15 13:43:52.024369 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Jan 15 13:43:52.024380 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 15 13:43:52.024391 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Jan 15 13:43:52.024402 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 15 13:43:52.024413 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Jan 15 13:43:52.024429 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 15 13:43:52.024446 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Jan 15 13:43:52.024458 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 15 13:43:52.024469 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 15 13:43:52.024481 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Jan 15 13:43:52.024492 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Jan 15 13:43:52.024504 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Jan 15 13:43:52.024515 kernel: Zone ranges:
Jan 15 13:43:52.024526 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 15 13:43:52.024538 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Jan 15 13:43:52.024554 kernel: Normal empty
Jan 15 13:43:52.024565 kernel: Movable zone start for each node
Jan 15 13:43:52.024577 kernel: Early memory node ranges
Jan 15 13:43:52.024588 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jan 15 13:43:52.024599 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Jan 15 13:43:52.024610 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Jan 15 13:43:52.024621 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 15 13:43:52.024633 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 15 13:43:52.024648 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Jan 15 13:43:52.024661 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 15 13:43:52.024685 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 15 13:43:52.024698 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 15 13:43:52.024710 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 15 13:43:52.024721 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 15 13:43:52.024732 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 15 13:43:52.024743 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 15 13:43:52.024755 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 15 13:43:52.024766 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 15 13:43:52.024777 kernel: TSC deadline timer available
Jan 15 13:43:52.024794 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Jan 15 13:43:52.024806 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 15 13:43:52.024817 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 15 13:43:52.024828 kernel: Booting paravirtualized kernel on KVM
Jan 15 13:43:52.024840 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 15 13:43:52.024851 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Jan 15 13:43:52.024863 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 15 13:43:52.024874 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 15 13:43:52.024885 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Jan 15 13:43:52.024902 kernel: kvm-guest: PV spinlocks enabled
Jan 15 13:43:52.024913 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 15 13:43:52.024926 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8945029ddd0f3864592f8746dde99cfcba228e0d3cb946f5938103dbe8733507
Jan 15 13:43:52.024938 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 15 13:43:52.024949 kernel: random: crng init done
Jan 15 13:43:52.024964 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 15 13:43:52.024984 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 15 13:43:52.025003 kernel: Fallback order for Node 0: 0
Jan 15 13:43:52.025031 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Jan 15 13:43:52.025057 kernel: Policy zone: DMA32
Jan 15 13:43:52.025071 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 15 13:43:52.025083 kernel: software IO TLB: area num 16.
Jan 15 13:43:52.025094 kernel: Memory: 1901528K/2096616K available (12288K kernel code, 2299K rwdata, 22728K rodata, 42844K init, 2348K bss, 194828K reserved, 0K cma-reserved)
Jan 15 13:43:52.025106 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Jan 15 13:43:52.025117 kernel: Kernel/User page tables isolation: enabled
Jan 15 13:43:52.025129 kernel: ftrace: allocating 37918 entries in 149 pages
Jan 15 13:43:52.025145 kernel: ftrace: allocated 149 pages with 4 groups
Jan 15 13:43:52.025157 kernel: Dynamic Preempt: voluntary
Jan 15 13:43:52.025168 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 15 13:43:52.025181 kernel: rcu: RCU event tracing is enabled.
Jan 15 13:43:52.025192 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Jan 15 13:43:52.025204 kernel: Trampoline variant of Tasks RCU enabled.
Jan 15 13:43:52.027265 kernel: Rude variant of Tasks RCU enabled.
Jan 15 13:43:52.027291 kernel: Tracing variant of Tasks RCU enabled.
Jan 15 13:43:52.027303 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 15 13:43:52.027315 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Jan 15 13:43:52.027327 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Jan 15 13:43:52.027339 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 15 13:43:52.027355 kernel: Console: colour VGA+ 80x25
Jan 15 13:43:52.027378 kernel: printk: console [tty0] enabled
Jan 15 13:43:52.027389 kernel: printk: console [ttyS0] enabled
Jan 15 13:43:52.027400 kernel: ACPI: Core revision 20230628
Jan 15 13:43:52.027411 kernel: APIC: Switch to symmetric I/O mode setup
Jan 15 13:43:52.027426 kernel: x2apic enabled
Jan 15 13:43:52.027437 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 15 13:43:52.027455 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Jan 15 13:43:52.027467 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 15 13:43:52.027478 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 15 13:43:52.027489 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 15 13:43:52.027500 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 15 13:43:52.027511 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 15 13:43:52.027534 kernel: Spectre V2 : Mitigation: Retpolines
Jan 15 13:43:52.027545 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 15 13:43:52.027562 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Jan 15 13:43:52.027573 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 15 13:43:52.027596 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 15 13:43:52.027607 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 15 13:43:52.027619 kernel: MDS: Mitigation: Clear CPU buffers
Jan 15 13:43:52.027630 kernel: MMIO Stale Data: Unknown: No mitigations
Jan 15 13:43:52.027641 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 15 13:43:52.027653 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 15 13:43:52.027686 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 15 13:43:52.027700 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 15 13:43:52.027712 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 15 13:43:52.027730 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Jan 15 13:43:52.027755 kernel: Freeing SMP alternatives memory: 32K
Jan 15 13:43:52.027775 kernel: pid_max: default: 32768 minimum: 301
Jan 15 13:43:52.027787 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 15 13:43:52.027799 kernel: landlock: Up and running.
Jan 15 13:43:52.027818 kernel: SELinux: Initializing.
Jan 15 13:43:52.027830 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 15 13:43:52.027842 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 15 13:43:52.027854 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Jan 15 13:43:52.027866 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 15 13:43:52.027878 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 15 13:43:52.027897 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 15 13:43:52.027909 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Jan 15 13:43:52.027921 kernel: signal: max sigframe size: 1776
Jan 15 13:43:52.027933 kernel: rcu: Hierarchical SRCU implementation.
Jan 15 13:43:52.027946 kernel: rcu: Max phase no-delay instances is 400.
Jan 15 13:43:52.027958 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jan 15 13:43:52.027969 kernel: smp: Bringing up secondary CPUs ...
Jan 15 13:43:52.027982 kernel: smpboot: x86: Booting SMP configuration:
Jan 15 13:43:52.027993 kernel: .... node #0, CPUs: #1
Jan 15 13:43:52.028011 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Jan 15 13:43:52.028023 kernel: smp: Brought up 1 node, 2 CPUs
Jan 15 13:43:52.028035 kernel: smpboot: Max logical packages: 16
Jan 15 13:43:52.028047 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
Jan 15 13:43:52.028059 kernel: devtmpfs: initialized
Jan 15 13:43:52.028071 kernel: x86/mm: Memory block size: 128MB
Jan 15 13:43:52.028083 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 15 13:43:52.028095 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Jan 15 13:43:52.028107 kernel: pinctrl core: initialized pinctrl subsystem
Jan 15 13:43:52.028124 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 15 13:43:52.028137 kernel: audit: initializing netlink subsys (disabled)
Jan 15 13:43:52.028149 kernel: audit: type=2000 audit(1736948631.101:1): state=initialized audit_enabled=0 res=1
Jan 15 13:43:52.028160 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 15 13:43:52.028172 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 15 13:43:52.028184 kernel: cpuidle: using governor menu
Jan 15 13:43:52.028209 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 15 13:43:52.028220 kernel: dca service started, version 1.12.1
Jan 15 13:43:52.028232 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Jan 15 13:43:52.028274 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 15 13:43:52.028286 kernel: PCI: Using configuration type 1 for base access
Jan 15 13:43:52.028299 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 15 13:43:52.028311 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 15 13:43:52.028323 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 15 13:43:52.028335 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 15 13:43:52.028346 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 15 13:43:52.028358 kernel: ACPI: Added _OSI(Module Device)
Jan 15 13:43:52.028370 kernel: ACPI: Added _OSI(Processor Device)
Jan 15 13:43:52.028388 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 15 13:43:52.028400 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 15 13:43:52.028412 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 15 13:43:52.028424 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Jan 15 13:43:52.028436 kernel: ACPI: Interpreter enabled
Jan 15 13:43:52.028448 kernel: ACPI: PM: (supports S0 S5)
Jan 15 13:43:52.028460 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 15 13:43:52.028472 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 15 13:43:52.028484 kernel: PCI: Using E820 reservations for host bridge windows
Jan 15 13:43:52.028501 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 15 13:43:52.028513 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 15 13:43:52.028795 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 15 13:43:52.028975 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 15 13:43:52.029142 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 15 13:43:52.029161 kernel: PCI host bridge to bus 0000:00
Jan 15 13:43:52.033394 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 15 13:43:52.033573 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 15 13:43:52.033748 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 15 13:43:52.033906 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 15 13:43:52.034061 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 15 13:43:52.034217 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Jan 15 13:43:52.036426 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 15 13:43:52.036631 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Jan 15 13:43:52.036874 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Jan 15 13:43:52.037046 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Jan 15 13:43:52.037213 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Jan 15 13:43:52.039425 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Jan 15 13:43:52.039615 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 15 13:43:52.039824 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.040002 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Jan 15 13:43:52.040180 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.040371 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Jan 15 13:43:52.040548 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.040730 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Jan 15 13:43:52.040926 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.041105 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Jan 15 13:43:52.043328 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.043530 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Jan 15 13:43:52.043734 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.043907 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Jan 15 13:43:52.044095 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.044299 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Jan 15 13:43:52.044482 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Jan 15 13:43:52.044652 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Jan 15 13:43:52.044878 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Jan 15 13:43:52.045050 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Jan 15 13:43:52.045220 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Jan 15 13:43:52.048649 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Jan 15 13:43:52.048852 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Jan 15 13:43:52.049046 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Jan 15 13:43:52.049220 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Jan 15 13:43:52.049408 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Jan 15 13:43:52.049578 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Jan 15 13:43:52.049771 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Jan 15 13:43:52.049943 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 15 13:43:52.050154 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Jan 15 13:43:52.052361 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Jan 15 13:43:52.052566 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Jan 15 13:43:52.052876 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Jan 15 13:43:52.053089 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Jan 15 13:43:52.053319 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Jan 15 13:43:52.053504 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Jan 15 13:43:52.053673 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 15 13:43:52.053854 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 15 13:43:52.054019 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 15 13:43:52.054212 kernel: pci_bus 0000:02: extended config space not accessible
Jan 15 13:43:52.056465 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Jan 15 13:43:52.056725 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Jan 15 13:43:52.056912 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 15 13:43:52.057092 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 15 13:43:52.057318 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Jan 15 13:43:52.057495 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Jan 15 13:43:52.057663 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 15 13:43:52.057842 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 15 13:43:52.058016 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 15 13:43:52.058200 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Jan 15 13:43:52.058401 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Jan 15 13:43:52.058571 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 15 13:43:52.058750 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 15 13:43:52.058915 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 15 13:43:52.059085 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 15 13:43:52.059277 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 15 13:43:52.059453 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 15 13:43:52.059621 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 15 13:43:52.059802 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 15 13:43:52.059972 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 15 13:43:52.060142 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 15 13:43:52.060325 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 15 13:43:52.060492 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 15 13:43:52.060660 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 15 13:43:52.060849 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 15 13:43:52.061016 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 15 13:43:52.061185 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 15 13:43:52.061373 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 15 13:43:52.061540 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 15 13:43:52.061559 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 15 13:43:52.061572 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 15 13:43:52.061585 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 15 13:43:52.061604 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 15 13:43:52.061617 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 15 13:43:52.061629 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 15 13:43:52.061641 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 15 13:43:52.061653 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 15 13:43:52.061665 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 15 13:43:52.061688 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 15 13:43:52.061702 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 15 13:43:52.061714 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 15 13:43:52.061732 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 15 13:43:52.061744 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 15 13:43:52.061756 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 15 13:43:52.061768 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 15 13:43:52.061781 kernel: iommu: Default domain type: Translated
Jan 15 13:43:52.061793 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 15 13:43:52.061805 kernel: PCI: Using ACPI for IRQ routing
Jan 15 13:43:52.061817 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 15 13:43:52.061829 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 15 13:43:52.061847 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Jan 15 13:43:52.062012 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 15 13:43:52.062180 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 15 13:43:52.062453 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 15 13:43:52.062474 kernel: vgaarb: loaded
Jan 15 13:43:52.062487 kernel: clocksource: Switched to clocksource kvm-clock
Jan 15 13:43:52.062499 kernel: VFS: Disk quotas dquot_6.6.0
Jan 15 13:43:52.062512 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 15 13:43:52.062531 kernel: pnp: PnP ACPI init
Jan 15 13:43:52.062727 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 15 13:43:52.062748 kernel: pnp: PnP ACPI: found 5 devices
Jan 15 13:43:52.062761 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 15 13:43:52.062773 kernel: NET: Registered PF_INET protocol family
Jan 15 13:43:52.062785 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 15 13:43:52.062797 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 15 13:43:52.062810 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 15 13:43:52.062829 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 15 13:43:52.062842 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 15 13:43:52.062854 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 15 13:43:52.062866 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 15 13:43:52.062879 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 15 13:43:52.062891 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 15 13:43:52.062903 kernel: NET: Registered PF_XDP protocol family
Jan 15 13:43:52.063065 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Jan 15 13:43:52.063243 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 15 13:43:52.063423 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 15 13:43:52.063590 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 15 13:43:52.063768 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 15 13:43:52.063933 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 15 13:43:52.064099 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 15 13:43:52.064292 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 15 13:43:52.064466 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Jan 15 13:43:52.064630 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Jan 15 13:43:52.064807 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Jan 15 13:43:52.064970 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Jan 15 13:43:52.065135 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Jan 15 13:43:52.065323 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Jan 15 13:43:52.065490 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Jan 15 13:43:52.065664 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Jan 15 13:43:52.065876 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 15 13:43:52.066055 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 15 13:43:52.066222 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 15 13:43:52.066435 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jan 15 13:43:52.066600 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 15 13:43:52.066777 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 15 13:43:52.066942 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 15 13:43:52.067132 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jan 15 13:43:52.067345 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 15 13:43:52.067513 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 15 13:43:52.067689 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 15 13:43:52.067857 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jan 15 13:43:52.068022 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 15 13:43:52.068195 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 15 13:43:52.068402 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 15 13:43:52.068569 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jan 15 13:43:52.068748 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 15 13:43:52.068915 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 15 13:43:52.069083 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 15 13:43:52.069287 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jan 15 13:43:52.069455 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 15 13:43:52.069621 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 15 13:43:52.069798 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 15 13:43:52.069972 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jan 15 13:43:52.070137 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 15 13:43:52.070329 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 15 13:43:52.070495 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 15 13:43:52.070668 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jan 15 13:43:52.070849 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 15 13:43:52.071014 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 15 13:43:52.071178 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 15 13:43:52.071362 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jan 15 13:43:52.071530 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 15 13:43:52.071713 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 15 13:43:52.071875 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jan 15 13:43:52.072030 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jan 15 13:43:52.072191 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 15 13:43:52.072370 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 15 13:43:52.072526 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 15 13:43:52.072690 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Jan 15 13:43:52.072867 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jan 15 13:43:52.073030 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Jan 15 13:43:52.073208 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 15 13:43:52.073430 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 15 13:43:52.073640 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Jan 15 13:43:52.073814 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 15 13:43:52.073972 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 15 13:43:52.074156 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Jan 15 13:43:52.074384 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 15 13:43:52.074543 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 15 13:43:52.074730 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Jan 15 13:43:52.074888 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 15 13:43:52.075068 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 15 13:43:52.075273 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Jan 15 13:43:52.075434 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 15 13:43:52.075591 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 15 13:43:52.075780 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Jan 15 13:43:52.075949 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 15 13:43:52.076109 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 15 13:43:52.076324 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Jan 15 13:43:52.076486 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 15 13:43:52.076644 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 15 13:43:52.076824 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Jan 15 13:43:52.076989 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 15 13:43:52.077145 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 15 13:43:52.077166 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 15 13:43:52.077179 kernel: PCI: CLS 0 bytes, default 64
Jan 15 13:43:52.077192 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 15 13:43:52.077205 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
Jan 15 13:43:52.077218 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 15 13:43:52.077248 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Jan 15 13:43:52.077264 kernel: Initialise system trusted keyrings
Jan 15 13:43:52.077284 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 15 13:43:52.077297 kernel: Key type asymmetric registered
Jan 15 13:43:52.077310 kernel: Asymmetric key parser 'x509' registered
Jan 15 13:43:52.077322 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 15 13:43:52.077335 kernel: io scheduler mq-deadline registered
Jan 15 13:43:52.077348 kernel: io scheduler kyber registered
Jan 15 13:43:52.077360 kernel: io scheduler bfq registered
Jan 15 13:43:52.077529 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 15 13:43:52.077710 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 15 13:43:52.077899 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.078071 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 15 13:43:52.078293 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 15 13:43:52.078467 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.078635 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 15 13:43:52.078813 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 15 13:43:52.078989 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.079156 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 15 13:43:52.079337 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 15 13:43:52.079503 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.079670 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 15 13:43:52.079846 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 15 13:43:52.080019 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.080187 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 15 13:43:52.080385 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 15 13:43:52.080551 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.080729 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 15 13:43:52.080895 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 15 13:43:52.081070 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.081249 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 15 13:43:52.081427 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 15 13:43:52.081593 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 15 13:43:52.081613 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jan 15 13:43:52.081627 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 15 13:43:52.081647 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 15 13:43:52.081660 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 15 13:43:52.081673 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 15 13:43:52.081697 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 15 13:43:52.081710 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 15 13:43:52.081723 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 15 13:43:52.081893 kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 15 13:43:52.081914 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jan 15 13:43:52.082076 kernel: rtc_cmos 00:03: registered as rtc0
Jan 15 13:43:52.082247 kernel: rtc_cmos 00:03: setting system clock to 2025-01-15T13:43:51 UTC (1736948631)
Jan 15 13:43:52.082410 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 15 13:43:52.082429 kernel: intel_pstate: CPU model not supported
Jan 15 13:43:52.082442 kernel: NET: Registered PF_INET6 protocol family
Jan 15 13:43:52.082455 kernel: Segment Routing with IPv6
Jan 15 13:43:52.082468 kernel: In-situ OAM (IOAM) with IPv6
Jan 15 13:43:52.082480 kernel: NET: Registered PF_PACKET protocol family
Jan 15 13:43:52.082493 kernel: Key type dns_resolver registered
Jan 15 13:43:52.082513 kernel: IPI shorthand broadcast: enabled
Jan 15 13:43:52.082526 kernel: sched_clock: Marking stable (1386004026, 219790304)->(1829814100, -224019770)
Jan 15 13:43:52.082539 kernel: registered taskstats version 1
Jan 15 13:43:52.082552 kernel: Loading compiled-in X.509 certificates
Jan 15 13:43:52.082565 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: e8ca4908f7ff887d90a0430272c92dde55624447'
Jan 15 13:43:52.082577 kernel: Key type .fscrypt registered
Jan 15 13:43:52.082590 kernel: Key type fscrypt-provisioning registered
Jan 15 13:43:52.082603 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 15 13:43:52.082621 kernel: ima: Allocated hash algorithm: sha1
Jan 15 13:43:52.082634 kernel: ima: No architecture policies found
Jan 15 13:43:52.082646 kernel: clk: Disabling unused clocks
Jan 15 13:43:52.082659 kernel: Freeing unused kernel image (initmem) memory: 42844K
Jan 15 13:43:52.082672 kernel: Write protecting the kernel read-only data: 36864k
Jan 15 13:43:52.082695 kernel: Freeing unused kernel image (rodata/data gap) memory: 1848K
Jan 15 13:43:52.082708 kernel: Run /init as init process
Jan 15 13:43:52.082721 kernel: with arguments:
Jan 15 13:43:52.082734 kernel: /init
Jan 15 13:43:52.082746 kernel: with environment:
Jan 15 13:43:52.082765 kernel: HOME=/
Jan 15 13:43:52.082778 kernel: TERM=linux
Jan 15 13:43:52.082790 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 15 13:43:52.082806 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 15 13:43:52.082822 systemd[1]: Detected virtualization kvm.
Jan 15 13:43:52.082835 systemd[1]: Detected architecture x86-64.
Jan 15 13:43:52.082848 systemd[1]: Running in initrd.
Jan 15 13:43:52.082867 systemd[1]: No hostname configured, using default hostname.
Jan 15 13:43:52.082880 systemd[1]: Hostname set to <localhost>.
Jan 15 13:43:52.082894 systemd[1]: Initializing machine ID from VM UUID.
Jan 15 13:43:52.082907 systemd[1]: Queued start job for default target initrd.target.
Jan 15 13:43:52.082920 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 15 13:43:52.082934 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 15 13:43:52.082948 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 15 13:43:52.082962 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 15 13:43:52.082980 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 15 13:43:52.082994 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 15 13:43:52.083009 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 15 13:43:52.083023 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 15 13:43:52.083037 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 15 13:43:52.083050 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 15 13:43:52.083063 systemd[1]: Reached target paths.target - Path Units.
Jan 15 13:43:52.083082 systemd[1]: Reached target slices.target - Slice Units.
Jan 15 13:43:52.083096 systemd[1]: Reached target swap.target - Swaps.
Jan 15 13:43:52.083109 systemd[1]: Reached target timers.target - Timer Units.
Jan 15 13:43:52.083122 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 15 13:43:52.083136 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 15 13:43:52.083150 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 15 13:43:52.083168 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 15 13:43:52.083182 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 15 13:43:52.083196 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 15 13:43:52.083215 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 15 13:43:52.083245 systemd[1]: Reached target sockets.target - Socket Units.
Jan 15 13:43:52.083262 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 15 13:43:52.083276 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 15 13:43:52.083289 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 15 13:43:52.083303 systemd[1]: Starting systemd-fsck-usr.service...
Jan 15 13:43:52.083316 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 15 13:43:52.083330 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 15 13:43:52.083350 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 15 13:43:52.083363 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 15 13:43:52.083377 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 15 13:43:52.083432 systemd-journald[201]: Collecting audit messages is disabled.
Jan 15 13:43:52.083469 systemd[1]: Finished systemd-fsck-usr.service.
Jan 15 13:43:52.083484 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 15 13:43:52.083498 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 15 13:43:52.083511 kernel: Bridge firewalling registered
Jan 15 13:43:52.083525 systemd-journald[201]: Journal started
Jan 15 13:43:52.083556 systemd-journald[201]: Runtime Journal (/run/log/journal/2d61c347ab1c4c4ca3ef23f6fb02c009) is 4.7M, max 38.0M, 33.2M free.
Jan 15 13:43:52.025326 systemd-modules-load[202]: Inserted module 'overlay'
Jan 15 13:43:52.112875 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 15 13:43:52.055531 systemd-modules-load[202]: Inserted module 'br_netfilter'
Jan 15 13:43:52.116032 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 15 13:43:52.118175 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 15 13:43:52.125418 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 15 13:43:52.127414 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 15 13:43:52.143423 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 15 13:43:52.145507 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 15 13:43:52.160601 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 15 13:43:52.165612 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 15 13:43:52.170766 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 15 13:43:52.178446 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 15 13:43:52.181139 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 15 13:43:52.183086 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 15 13:43:52.193481 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 15 13:43:52.197147 dracut-cmdline[233]: dracut-dracut-053
Jan 15 13:43:52.199873 dracut-cmdline[233]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8945029ddd0f3864592f8746dde99cfcba228e0d3cb946f5938103dbe8733507
Jan 15 13:43:52.242130 systemd-resolved[238]: Positive Trust Anchors:
Jan 15 13:43:52.243129 systemd-resolved[238]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 15 13:43:52.243172 systemd-resolved[238]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 15 13:43:52.247039 systemd-resolved[238]: Defaulting to hostname 'linux'.
Jan 15 13:43:52.249968 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 15 13:43:52.251838 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 15 13:43:52.313298 kernel: SCSI subsystem initialized
Jan 15 13:43:52.324267 kernel: Loading iSCSI transport class v2.0-870.
Jan 15 13:43:52.337258 kernel: iscsi: registered transport (tcp)
Jan 15 13:43:52.361570 kernel: iscsi: registered transport (qla4xxx)
Jan 15 13:43:52.361648 kernel: QLogic iSCSI HBA Driver
Jan 15 13:43:52.416107 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 15 13:43:52.425458 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 15 13:43:52.456628 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 15 13:43:52.456727 kernel: device-mapper: uevent: version 1.0.3
Jan 15 13:43:52.456748 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 15 13:43:52.507273 kernel: raid6: sse2x4 gen() 14808 MB/s
Jan 15 13:43:52.523273 kernel: raid6: sse2x2 gen() 9820 MB/s
Jan 15 13:43:52.541798 kernel: raid6: sse2x1 gen() 9849 MB/s
Jan 15 13:43:52.541841 kernel: raid6: using algorithm sse2x4 gen() 14808 MB/s
Jan 15 13:43:52.560777 kernel: raid6: .... xor() 7609 MB/s, rmw enabled
Jan 15 13:43:52.560879 kernel: raid6: using ssse3x2 recovery algorithm
Jan 15 13:43:52.585291 kernel: xor: automatically using best checksumming function avx
Jan 15 13:43:52.768332 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 15 13:43:52.783437 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 15 13:43:52.792483 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 15 13:43:52.808974 systemd-udevd[419]: Using default interface naming scheme 'v255'.
Jan 15 13:43:52.815439 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 15 13:43:52.823446 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 15 13:43:52.844432 dracut-pre-trigger[425]: rd.md=0: removing MD RAID activation
Jan 15 13:43:52.886018 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 15 13:43:52.892502 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 15 13:43:53.004541 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 15 13:43:53.014435 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 15 13:43:53.032143 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 15 13:43:53.039225 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 15 13:43:53.040393 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 15 13:43:53.042813 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 15 13:43:53.054424 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 15 13:43:53.078631 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 15 13:43:53.150454 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Jan 15 13:43:53.197321 kernel: cryptd: max_cpu_qlen set to 1000
Jan 15 13:43:53.197345 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Jan 15 13:43:53.197561 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 15 13:43:53.197581 kernel: GPT:17805311 != 125829119
Jan 15 13:43:53.197608 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 15 13:43:53.197631 kernel: GPT:17805311 != 125829119
Jan 15 13:43:53.197647 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 15 13:43:53.197675 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 15 13:43:53.197702 kernel: ACPI: bus type USB registered
Jan 15 13:43:53.197718 kernel: usbcore: registered new interface driver usbfs
Jan 15 13:43:53.197734 kernel: usbcore: registered new interface driver hub
Jan 15 13:43:53.197750 kernel: usbcore: registered new device driver usb
Jan 15 13:43:53.197766 kernel: AVX version of gcm_enc/dec engaged.
Jan 15 13:43:53.197788 kernel: AES CTR mode by8 optimization enabled
Jan 15 13:43:53.202275 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 15 13:43:53.202364 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 13:43:53.204694 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 15 13:43:53.205385 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 13:43:53.205454 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 13:43:53.206153 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 13:43:53.216483 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 13:43:53.223275 kernel: libata version 3.00 loaded. Jan 15 13:43:53.234605 kernel: ahci 0000:00:1f.2: version 3.0 Jan 15 13:43:53.297397 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 15 13:43:53.297429 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Jan 15 13:43:53.297642 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 15 13:43:53.297863 kernel: BTRFS: device fsid b8e2d3c5-4bed-4339-bed5-268c66823686 devid 1 transid 37 /dev/vda3 scanned by (udev-worker) (468) Jan 15 13:43:53.297885 kernel: scsi host0: ahci Jan 15 13:43:53.298096 kernel: scsi host1: ahci Jan 15 13:43:53.298336 kernel: scsi host2: ahci Jan 15 13:43:53.298529 kernel: scsi host3: ahci Jan 15 13:43:53.298733 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (474) Jan 15 13:43:53.298754 kernel: scsi host4: ahci Jan 15 13:43:53.298956 kernel: scsi host5: ahci Jan 15 13:43:53.299144 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Jan 15 13:43:53.299164 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Jan 15 13:43:53.299181 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Jan 15 13:43:53.299197 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Jan 15 13:43:53.299213 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Jan 15 13:43:53.301257 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Jan 15 13:43:53.301290 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Jan 15 13:43:53.324489 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Jan 15 13:43:53.324730 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 15 13:43:53.324935 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Jan 15 13:43:53.325135 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Jan 15 13:43:53.325374 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Jan 15 13:43:53.325572 kernel: hub 1-0:1.0: USB hub found Jan 15 13:43:53.325808 kernel: hub 1-0:1.0: 4 ports detected Jan 15 13:43:53.326014 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 15 13:43:53.328261 kernel: hub 2-0:1.0: USB hub found Jan 15 13:43:53.328480 kernel: hub 2-0:1.0: 4 ports detected Jan 15 13:43:53.286982 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 15 13:43:53.402560 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 13:43:53.410365 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 15 13:43:53.422280 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jan 15 13:43:53.427898 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 15 13:43:53.428708 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jan 15 13:43:53.436442 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 15 13:43:53.441298 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 15 13:43:53.444929 disk-uuid[564]: Primary Header is updated. Jan 15 13:43:53.444929 disk-uuid[564]: Secondary Entries is updated. Jan 15 13:43:53.444929 disk-uuid[564]: Secondary Header is updated. Jan 15 13:43:53.451275 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 13:43:53.458332 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 13:43:53.466260 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 13:43:53.473965 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 13:43:53.563278 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 15 13:43:53.606465 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 15 13:43:53.606679 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 15 13:43:53.616269 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 15 13:43:53.616350 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 15 13:43:53.620151 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 15 13:43:53.620218 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 15 13:43:53.710274 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 15 13:43:53.716362 kernel: usbcore: registered new interface driver usbhid Jan 15 13:43:53.716416 kernel: usbhid: USB HID core driver Jan 15 13:43:53.723871 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 15 13:43:53.723908 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Jan 15 13:43:54.466277 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 13:43:54.468383 disk-uuid[565]: The operation has completed successfully. Jan 15 13:43:54.516911 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 15 13:43:54.517101 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 15 13:43:54.536582 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 15 13:43:54.541685 sh[588]: Success Jan 15 13:43:54.558268 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Jan 15 13:43:54.626526 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 15 13:43:54.629401 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 15 13:43:54.632387 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
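The disk-uuid service above is the follow-up to the earlier GPT warnings ("GPT:17805311 != 125829119"): the image's partition table was built for a smaller disk, so the primary header still points the alternate (backup) header at LBA 17805311 instead of the final sector of the 60 GiB virtio volume. Updating the secondary header and entries moves them to the true end of the disk. The arithmetic, sketched in Python with the values from this log:

    SECTOR = 512

    def check_backup_header(total_sectors, alt_lba_from_primary):
        last_lba = total_sectors - 1          # where the backup header belongs
        if alt_lba_from_primary != last_lba:
            print(f'GPT: {alt_lba_from_primary} != {last_lba} '
                  '(alternate header not at end of disk)')
            return last_lba                    # relocate it, then refresh CRCs
        return alt_lba_from_primary

    # 125829120 sectors of 512 bytes = the 60 GiB /dev/vda reported earlier;
    # 17805311 implies the original image was sized for roughly 8.5 GiB.
    check_backup_header(125829120, 17805311)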
Jan 15 13:43:54.654757 kernel: BTRFS info (device dm-0): first mount of filesystem b8e2d3c5-4bed-4339-bed5-268c66823686 Jan 15 13:43:54.654838 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 15 13:43:54.656781 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 15 13:43:54.660076 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 15 13:43:54.660126 kernel: BTRFS info (device dm-0): using free space tree Jan 15 13:43:54.671373 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 15 13:43:54.672818 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 15 13:43:54.686465 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 15 13:43:54.691449 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 15 13:43:54.702466 kernel: BTRFS info (device vda6): first mount of filesystem 70d8a0b5-70da-4efb-a618-d15543718b1e Jan 15 13:43:54.702541 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 13:43:54.704275 kernel: BTRFS info (device vda6): using free space tree Jan 15 13:43:54.710253 kernel: BTRFS info (device vda6): auto enabling async discard Jan 15 13:43:54.723404 kernel: BTRFS info (device vda6): last unmount of filesystem 70d8a0b5-70da-4efb-a618-d15543718b1e Jan 15 13:43:54.722329 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 15 13:43:54.729664 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 15 13:43:54.738502 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 15 13:43:54.939116 ignition[670]: Ignition 2.19.0 Jan 15 13:43:54.939160 ignition[670]: Stage: fetch-offline Jan 15 13:43:54.939293 ignition[670]: no configs at "/usr/lib/ignition/base.d" Jan 15 13:43:54.939351 ignition[670]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:43:54.939610 ignition[670]: parsed url from cmdline: "" Jan 15 13:43:54.944849 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 13:43:54.939616 ignition[670]: no config URL provided Jan 15 13:43:54.939625 ignition[670]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 13:43:54.939659 ignition[670]: no config at "/usr/lib/ignition/user.ign" Jan 15 13:43:54.939668 ignition[670]: failed to fetch config: resource requires networking Jan 15 13:43:54.940024 ignition[670]: Ignition finished successfully Jan 15 13:43:54.960066 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 13:43:54.967516 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 13:43:55.012850 systemd-networkd[776]: lo: Link UP Jan 15 13:43:55.012866 systemd-networkd[776]: lo: Gained carrier Jan 15 13:43:55.015611 systemd-networkd[776]: Enumeration completed Jan 15 13:43:55.016179 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 13:43:55.016273 systemd-networkd[776]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 13:43:55.016279 systemd-networkd[776]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 13:43:55.017020 systemd[1]: Reached target network.target - Network. 
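The filesystem mounted above lives on /dev/mapper/usr, the dm-verity device whose root hash came from the verity.usrhash= kernel argument. Every read from /usr is checked against a hash tree, so offline tampering with the partition is detected at access time. A toy two-level version of that check (real dm-verity uses a salted multi-level tree stored on disk):

    import hashlib

    BLOCK = 4096

    def verity_root(image) -> str:
        leaf_hashes = b''.join(
            hashlib.sha256(image[i:i + BLOCK]).digest()
            for i in range(0, len(image), BLOCK))
        return hashlib.sha256(leaf_hashes).hexdigest()

    image = bytearray(BLOCK * 4)       # stand-in for the usr partition
    expected = verity_root(image)      # in reality: the usrhash= value
    image[123] ^= 0xFF                 # simulate one corrupted byte
    assert verity_root(image) != expected   # any flipped bit changes the root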
Jan 15 13:43:55.017659 systemd-networkd[776]: eth0: Link UP Jan 15 13:43:55.017665 systemd-networkd[776]: eth0: Gained carrier Jan 15 13:43:55.017681 systemd-networkd[776]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 13:43:55.026014 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 15 13:43:55.044345 systemd-networkd[776]: eth0: DHCPv4 address 10.230.66.218/30, gateway 10.230.66.217 acquired from 10.230.66.217 Jan 15 13:43:55.064001 ignition[778]: Ignition 2.19.0 Jan 15 13:43:55.064025 ignition[778]: Stage: fetch Jan 15 13:43:55.064352 ignition[778]: no configs at "/usr/lib/ignition/base.d" Jan 15 13:43:55.064374 ignition[778]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:43:55.064532 ignition[778]: parsed url from cmdline: "" Jan 15 13:43:55.064539 ignition[778]: no config URL provided Jan 15 13:43:55.064548 ignition[778]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 13:43:55.064565 ignition[778]: no config at "/usr/lib/ignition/user.ign" Jan 15 13:43:55.064762 ignition[778]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 15 13:43:55.064908 ignition[778]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 15 13:43:55.064965 ignition[778]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 15 13:43:55.080390 ignition[778]: GET result: OK Jan 15 13:43:55.080893 ignition[778]: parsing config with SHA512: aaf079ec83da74ec87b8c84f52bf9853d504241005dd057b8e1f4b8f45fe1a2a08ded84d06b7cad17a449f71c34a6131a59a8380ad8070860edcfb927d98c617 Jan 15 13:43:55.087185 unknown[778]: fetched base config from "system" Jan 15 13:43:55.087202 unknown[778]: fetched base config from "system" Jan 15 13:43:55.087666 ignition[778]: fetch: fetch complete Jan 15 13:43:55.087211 unknown[778]: fetched user config from "openstack" Jan 15 13:43:55.087675 ignition[778]: fetch: fetch passed Jan 15 13:43:55.089411 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 15 13:43:55.087736 ignition[778]: Ignition finished successfully Jan 15 13:43:55.099536 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 15 13:43:55.130529 ignition[785]: Ignition 2.19.0 Jan 15 13:43:55.130551 ignition[785]: Stage: kargs Jan 15 13:43:55.130845 ignition[785]: no configs at "/usr/lib/ignition/base.d" Jan 15 13:43:55.130865 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:43:55.133949 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 15 13:43:55.131983 ignition[785]: kargs: kargs passed Jan 15 13:43:55.132053 ignition[785]: Ignition finished successfully Jan 15 13:43:55.148459 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 15 13:43:55.166291 ignition[791]: Ignition 2.19.0 Jan 15 13:43:55.166313 ignition[791]: Stage: disks Jan 15 13:43:55.166546 ignition[791]: no configs at "/usr/lib/ignition/base.d" Jan 15 13:43:55.168896 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 15 13:43:55.166566 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:43:55.167719 ignition[791]: disks: disks passed Jan 15 13:43:55.172090 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 15 13:43:55.167790 ignition[791]: Ignition finished successfully Jan 15 13:43:55.173075 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
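The fetch stage above shows Ignition's two config sources on OpenStack: a config drive labelled config-2, and the link-local metadata service, tried once networking is up (which is why the earlier fetch-offline stage gave up with "resource requires networking"). A simplified sketch of that flow; Ignition itself is Go, and the retry policy here is invented:

    import hashlib, os, time, urllib.request

    METADATA_URL = 'http://169.254.169.254/openstack/latest/user_data'
    CONFIG_DRIVE = '/dev/disk/by-label/config-2'

    def fetch_user_data(attempts=5):
        for attempt in range(1, attempts + 1):
            if os.path.exists(CONFIG_DRIVE):
                return None   # would mount the drive and read the config there
            try:
                print(f'GET {METADATA_URL}: attempt #{attempt}')
                with urllib.request.urlopen(METADATA_URL, timeout=10) as resp:
                    data = resp.read()
                # matches the "parsing config with SHA512: ..." line above
                print('parsing config with SHA512:',
                      hashlib.sha512(data).hexdigest())
                return data
            except OSError:
                time.sleep(2 * attempt)   # back off while DHCP settles
        raise RuntimeError('failed to fetch config: resource requires networking')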
Jan 15 13:43:55.174354 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 13:43:55.175784 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 13:43:55.177045 systemd[1]: Reached target basic.target - Basic System. Jan 15 13:43:55.184482 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 15 13:43:55.205369 systemd-fsck[799]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 15 13:43:55.208066 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 15 13:43:55.214387 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 15 13:43:55.354542 kernel: EXT4-fs (vda9): mounted filesystem 39899d4c-a8b1-4feb-9875-e812cc535888 r/w with ordered data mode. Quota mode: none. Jan 15 13:43:55.355557 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 15 13:43:55.356848 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 15 13:43:55.373417 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 13:43:55.376356 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 15 13:43:55.378130 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 15 13:43:55.380483 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 15 13:43:55.384312 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 15 13:43:55.387690 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (807) Jan 15 13:43:55.384402 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 13:43:55.397256 kernel: BTRFS info (device vda6): first mount of filesystem 70d8a0b5-70da-4efb-a618-d15543718b1e Jan 15 13:43:55.397285 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 13:43:55.397313 kernel: BTRFS info (device vda6): using free space tree Jan 15 13:43:55.397332 kernel: BTRFS info (device vda6): auto enabling async discard Jan 15 13:43:55.402313 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 15 13:43:55.403193 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 15 13:43:55.414544 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 15 13:43:55.486859 initrd-setup-root[835]: cut: /sysroot/etc/passwd: No such file or directory Jan 15 13:43:55.494957 initrd-setup-root[843]: cut: /sysroot/etc/group: No such file or directory Jan 15 13:43:55.505611 initrd-setup-root[850]: cut: /sysroot/etc/shadow: No such file or directory Jan 15 13:43:55.512252 initrd-setup-root[857]: cut: /sysroot/etc/gshadow: No such file or directory Jan 15 13:43:55.620928 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 15 13:43:55.626357 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 15 13:43:55.629427 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 15 13:43:55.645270 kernel: BTRFS info (device vda6): last unmount of filesystem 70d8a0b5-70da-4efb-a618-d15543718b1e Jan 15 13:43:55.650912 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 15 13:43:55.673056 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
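The cut: ... No such file or directory lines above are expected on a first boot: initrd-setup-root apparently inspects any account databases already present on the new root before seeding them, and on a fresh ROOT filesystem there are none. A speculative Python rendering of that check-then-seed idea (the real logic is a shell script whose exact merge rules are not visible in this log):

    from pathlib import Path

    def seed_database(name, sysroot='/sysroot'):
        target = Path(sysroot, 'etc', name)
        existing = set()
        try:
            # roughly `cut -d: -f1` on the target: names already present
            existing = {line.split(':', 1)[0]
                        for line in target.read_text().splitlines()}
        except FileNotFoundError:
            print(f'cut: {target}: No such file or directory')   # first boot
        fresh = [line for line in Path('/etc', name).read_text().splitlines()
                 if line.split(':', 1)[0] not in existing]
        target.parent.mkdir(parents=True, exist_ok=True)
        with target.open('a') as out:
            out.write('\n'.join(fresh) + '\n')

    for db in ('passwd', 'group', 'shadow', 'gshadow'):
        seed_database(db)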
Jan 15 13:43:55.695077 ignition[927]: INFO : Ignition 2.19.0 Jan 15 13:43:55.695077 ignition[927]: INFO : Stage: mount Jan 15 13:43:55.698155 ignition[927]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 13:43:55.698155 ignition[927]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:43:55.698155 ignition[927]: INFO : mount: mount passed Jan 15 13:43:55.698155 ignition[927]: INFO : Ignition finished successfully Jan 15 13:43:55.698431 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 15 13:43:56.190998 systemd-networkd[776]: eth0: Gained IPv6LL Jan 15 13:43:57.697169 systemd-networkd[776]: eth0: Ignoring DHCPv6 address 2a02:1348:179:90b6:24:19ff:fee6:42da/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:90b6:24:19ff:fee6:42da/64 assigned by NDisc. Jan 15 13:43:57.697187 systemd-networkd[776]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 15 13:44:02.566317 coreos-metadata[809]: Jan 15 13:44:02.566 WARN failed to locate config-drive, using the metadata service API instead Jan 15 13:44:02.587413 coreos-metadata[809]: Jan 15 13:44:02.587 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 13:44:02.599940 coreos-metadata[809]: Jan 15 13:44:02.599 INFO Fetch successful Jan 15 13:44:02.600809 coreos-metadata[809]: Jan 15 13:44:02.600 INFO wrote hostname srv-6yg2e.gb1.brightbox.com to /sysroot/etc/hostname Jan 15 13:44:02.602146 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 15 13:44:02.602372 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 15 13:44:02.610361 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 15 13:44:02.620934 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 13:44:02.638291 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (944) Jan 15 13:44:02.644282 kernel: BTRFS info (device vda6): first mount of filesystem 70d8a0b5-70da-4efb-a618-d15543718b1e Jan 15 13:44:02.644347 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 13:44:02.644365 kernel: BTRFS info (device vda6): using free space tree Jan 15 13:44:02.649267 kernel: BTRFS info (device vda6): auto enabling async discard Jan 15 13:44:02.652086 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 15 13:44:02.694258 ignition[961]: INFO : Ignition 2.19.0 Jan 15 13:44:02.695393 ignition[961]: INFO : Stage: files Jan 15 13:44:02.696025 ignition[961]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 13:44:02.696025 ignition[961]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:44:02.697733 ignition[961]: DEBUG : files: compiled without relabeling support, skipping Jan 15 13:44:02.698612 ignition[961]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 15 13:44:02.698612 ignition[961]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 15 13:44:02.702030 ignition[961]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 15 13:44:02.703012 ignition[961]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 15 13:44:02.703012 ignition[961]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 15 13:44:02.702741 unknown[961]: wrote ssh authorized keys file for user: core Jan 15 13:44:02.705858 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 15 13:44:02.705858 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 15 13:44:02.919902 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 15 13:44:03.655932 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 13:44:03.657557 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 13:44:03.672068 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 13:44:03.672068 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 13:44:03.672068 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 15 13:44:03.672068 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 15 13:44:03.672068 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 15 13:44:03.672068 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Jan 15 13:44:04.271129 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 15 13:44:06.495691 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 15 13:44:06.495691 ignition[961]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 15 13:44:06.502722 ignition[961]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 13:44:06.502722 ignition[961]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 13:44:06.502722 ignition[961]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 15 13:44:06.502722 ignition[961]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 15 13:44:06.508585 ignition[961]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 15 13:44:06.508585 ignition[961]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 15 13:44:06.508585 ignition[961]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 15 13:44:06.508585 ignition[961]: INFO : files: files passed Jan 15 13:44:06.508585 ignition[961]: INFO : Ignition finished successfully Jan 15 13:44:06.514788 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 15 13:44:06.525670 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 15 13:44:06.533619 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 15 13:44:06.544765 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 15 13:44:06.545001 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 15 13:44:06.554475 initrd-setup-root-after-ignition[991]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 13:44:06.556818 initrd-setup-root-after-ignition[991]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 15 13:44:06.558944 initrd-setup-root-after-ignition[995]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 13:44:06.560793 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 13:44:06.562571 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 15 13:44:06.568491 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 15 13:44:06.612572 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 15 13:44:06.612800 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 15 13:44:06.615940 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jan 15 13:44:06.616673 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 15 13:44:06.617625 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 15 13:44:06.621535 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 15 13:44:06.648567 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 13:44:06.655488 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 15 13:44:06.671239 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 15 13:44:06.673142 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 13:44:06.675278 systemd[1]: Stopped target timers.target - Timer Units. Jan 15 13:44:06.676029 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 15 13:44:06.676211 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 13:44:06.678306 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 15 13:44:06.679320 systemd[1]: Stopped target basic.target - Basic System. Jan 15 13:44:06.680666 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 15 13:44:06.682025 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 13:44:06.683523 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 15 13:44:06.685051 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 15 13:44:06.686673 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 13:44:06.688257 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 15 13:44:06.689733 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 15 13:44:06.691249 systemd[1]: Stopped target swap.target - Swaps. Jan 15 13:44:06.692602 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 15 13:44:06.692837 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 15 13:44:06.694556 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 15 13:44:06.695524 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 13:44:06.696931 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 15 13:44:06.697157 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 13:44:06.698429 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 15 13:44:06.698701 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 15 13:44:06.700566 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 15 13:44:06.700806 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 13:44:06.701837 systemd[1]: ignition-files.service: Deactivated successfully. Jan 15 13:44:06.702084 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 15 13:44:06.708584 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 15 13:44:06.718710 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 15 13:44:06.720712 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 15 13:44:06.720948 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 13:44:06.725384 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Jan 15 13:44:06.725598 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 13:44:06.733939 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 15 13:44:06.734089 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 15 13:44:06.751797 ignition[1015]: INFO : Ignition 2.19.0 Jan 15 13:44:06.751797 ignition[1015]: INFO : Stage: umount Jan 15 13:44:06.755157 ignition[1015]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 13:44:06.755157 ignition[1015]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 13:44:06.757964 ignition[1015]: INFO : umount: umount passed Jan 15 13:44:06.756881 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 15 13:44:06.760311 ignition[1015]: INFO : Ignition finished successfully Jan 15 13:44:06.759881 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 15 13:44:06.760053 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 15 13:44:06.763074 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 15 13:44:06.763174 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 15 13:44:06.764269 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 15 13:44:06.764377 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 15 13:44:06.765769 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 15 13:44:06.765913 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 15 13:44:06.767872 systemd[1]: Stopped target network.target - Network. Jan 15 13:44:06.769283 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 15 13:44:06.769373 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 13:44:06.770753 systemd[1]: Stopped target paths.target - Path Units. Jan 15 13:44:06.772779 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 15 13:44:06.776330 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 13:44:06.777643 systemd[1]: Stopped target slices.target - Slice Units. Jan 15 13:44:06.780217 systemd[1]: Stopped target sockets.target - Socket Units. Jan 15 13:44:06.781718 systemd[1]: iscsid.socket: Deactivated successfully. Jan 15 13:44:06.781842 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 13:44:06.782890 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 15 13:44:06.782955 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 13:44:06.785678 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 15 13:44:06.785776 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 15 13:44:06.787015 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 15 13:44:06.787137 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 15 13:44:06.788911 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 15 13:44:06.791114 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 15 13:44:06.792427 systemd-networkd[776]: eth0: DHCPv6 lease lost Jan 15 13:44:06.793039 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 15 13:44:06.793281 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 15 13:44:06.797426 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 15 13:44:06.797726 systemd[1]: Stopped systemd-networkd.service - Network Configuration. 
Jan 15 13:44:06.800893 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 15 13:44:06.801008 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 15 13:44:06.802464 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 15 13:44:06.802572 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 15 13:44:06.813481 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 15 13:44:06.814179 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 15 13:44:06.814285 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 13:44:06.816448 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 13:44:06.820967 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 15 13:44:06.821220 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 15 13:44:06.836880 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 15 13:44:06.837200 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 13:44:06.840878 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 15 13:44:06.841003 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 15 13:44:06.841940 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 15 13:44:06.841994 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 13:44:06.843545 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 15 13:44:06.843614 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 15 13:44:06.845677 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 15 13:44:06.845780 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 15 13:44:06.847201 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 15 13:44:06.847303 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 13:44:06.854507 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 15 13:44:06.855608 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 15 13:44:06.855694 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 15 13:44:06.858106 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 15 13:44:06.858177 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 15 13:44:06.859782 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 15 13:44:06.859853 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 13:44:06.861309 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 15 13:44:06.861377 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 13:44:06.863336 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 15 13:44:06.863409 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 13:44:06.864177 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 15 13:44:06.864266 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 13:44:06.865095 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Jan 15 13:44:06.865173 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 13:44:06.866530 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 15 13:44:06.866710 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 15 13:44:06.876569 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 15 13:44:06.876747 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 15 13:44:06.879963 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 15 13:44:06.890563 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 15 13:44:06.900601 systemd[1]: Switching root. Jan 15 13:44:06.937789 systemd-journald[201]: Journal stopped Jan 15 13:44:08.465901 systemd-journald[201]: Received SIGTERM from PID 1 (systemd). Jan 15 13:44:08.466015 kernel: SELinux: policy capability network_peer_controls=1 Jan 15 13:44:08.466041 kernel: SELinux: policy capability open_perms=1 Jan 15 13:44:08.466087 kernel: SELinux: policy capability extended_socket_class=1 Jan 15 13:44:08.466146 kernel: SELinux: policy capability always_check_network=0 Jan 15 13:44:08.466184 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 15 13:44:08.466204 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 15 13:44:08.466222 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 15 13:44:08.467031 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 15 13:44:08.467056 kernel: audit: type=1403 audit(1736948647.186:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 15 13:44:08.467077 systemd[1]: Successfully loaded SELinux policy in 60.281ms. Jan 15 13:44:08.467107 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.634ms. Jan 15 13:44:08.467163 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 15 13:44:08.467186 systemd[1]: Detected virtualization kvm. Jan 15 13:44:08.467206 systemd[1]: Detected architecture x86-64. Jan 15 13:44:08.467225 systemd[1]: Detected first boot. Jan 15 13:44:08.467270 systemd[1]: Hostname set to . Jan 15 13:44:08.467298 systemd[1]: Initializing machine ID from VM UUID. Jan 15 13:44:08.467318 zram_generator::config[1057]: No configuration found. Jan 15 13:44:08.467340 systemd[1]: Populated /etc with preset unit settings. Jan 15 13:44:08.467389 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 15 13:44:08.467424 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 15 13:44:08.467446 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 15 13:44:08.467467 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 15 13:44:08.467486 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 15 13:44:08.467505 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 15 13:44:08.467525 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 15 13:44:08.467545 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 15 13:44:08.467565 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
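"Initializing machine ID from VM UUID" above means systemd derived /etc/machine-id from the SMBIOS product UUID that KVM exposes, rather than generating a random one. A minimal sketch of that derivation (reading the sysfs file needs root):

    from pathlib import Path

    raw = Path('/sys/class/dmi/id/product_uuid').read_text().strip()
    machine_id = raw.replace('-', '').lower()   # 32 hex digits, no dashes
    assert len(machine_id) == 32
    print(machine_id)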
Jan 15 13:44:08.467614 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 15 13:44:08.467666 systemd[1]: Created slice user.slice - User and Session Slice. Jan 15 13:44:08.467689 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 13:44:08.467709 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 13:44:08.467729 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 15 13:44:08.467748 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 15 13:44:08.467768 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 15 13:44:08.467788 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 13:44:08.467835 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 15 13:44:08.467857 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 13:44:08.467877 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 15 13:44:08.467897 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 15 13:44:08.467917 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 15 13:44:08.467936 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 15 13:44:08.467986 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 13:44:08.468009 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 13:44:08.468029 systemd[1]: Reached target slices.target - Slice Units. Jan 15 13:44:08.468049 systemd[1]: Reached target swap.target - Swaps. Jan 15 13:44:08.468068 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 15 13:44:08.468087 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 15 13:44:08.468107 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 13:44:08.468184 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 13:44:08.468209 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 13:44:08.470822 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 15 13:44:08.470854 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 15 13:44:08.470874 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 15 13:44:08.470921 systemd[1]: Mounting media.mount - External Media Directory... Jan 15 13:44:08.470943 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:08.470961 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 15 13:44:08.471017 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 15 13:44:08.471039 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 15 13:44:08.471062 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 15 13:44:08.471081 systemd[1]: Reached target machines.target - Containers. 
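This burst of Created slice / Started / Set up automount lines follows from "Populated /etc with preset unit settings" above: each installed unit is enabled or disabled according to the first preset line that matches it. A compact model of that evaluation order:

    import fnmatch

    PRESETS = [
        'enable prepare-helm.service',   # e.g. the line Ignition wrote earlier
        'disable *',                     # typical distro catch-all
    ]

    def preset_action(unit):
        for line in PRESETS:             # preset files are read in lexical order
            verb, pattern = line.split(None, 1)
            if fnmatch.fnmatch(unit, pattern):
                return verb              # first match wins
        return 'enable'                  # systemd's default when nothing matches

    print(preset_action('prepare-helm.service'))   # enable
    print(preset_action('sshd.service'))           # disable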
Jan 15 13:44:08.471100 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 15 13:44:08.471144 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 13:44:08.471177 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 13:44:08.471195 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 15 13:44:08.471226 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 13:44:08.472714 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 13:44:08.472742 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 13:44:08.472762 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 15 13:44:08.472781 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 13:44:08.472802 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 15 13:44:08.472822 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 15 13:44:08.472853 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 15 13:44:08.472872 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 15 13:44:08.472923 systemd[1]: Stopped systemd-fsck-usr.service. Jan 15 13:44:08.472946 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 13:44:08.472965 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 13:44:08.472995 kernel: fuse: init (API version 7.39) Jan 15 13:44:08.473014 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 13:44:08.473033 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 15 13:44:08.473052 kernel: loop: module loaded Jan 15 13:44:08.473072 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 13:44:08.473103 systemd[1]: verity-setup.service: Deactivated successfully. Jan 15 13:44:08.473161 systemd[1]: Stopped verity-setup.service. Jan 15 13:44:08.473186 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:08.473205 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 15 13:44:08.473226 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 15 13:44:08.473278 systemd[1]: Mounted media.mount - External Media Directory. Jan 15 13:44:08.473312 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 15 13:44:08.473364 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 15 13:44:08.473387 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 15 13:44:08.473419 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 13:44:08.473441 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 15 13:44:08.473461 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 15 13:44:08.473480 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 13:44:08.473500 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Jan 15 13:44:08.473552 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 13:44:08.473575 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 13:44:08.473595 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 15 13:44:08.473614 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 15 13:44:08.473634 kernel: ACPI: bus type drm_connector registered Jan 15 13:44:08.473653 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 13:44:08.473700 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 13:44:08.473750 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 13:44:08.473784 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 13:44:08.473803 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 13:44:08.473823 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 13:44:08.473841 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 15 13:44:08.473860 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 13:44:08.473892 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 15 13:44:08.473984 systemd-journald[1146]: Collecting audit messages is disabled. Jan 15 13:44:08.474032 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 15 13:44:08.474053 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 15 13:44:08.474072 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 13:44:08.474100 systemd-journald[1146]: Journal started Jan 15 13:44:08.474132 systemd-journald[1146]: Runtime Journal (/run/log/journal/2d61c347ab1c4c4ca3ef23f6fb02c009) is 4.7M, max 38.0M, 33.2M free. Jan 15 13:44:08.479320 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 15 13:44:07.985011 systemd[1]: Queued start job for default target multi-user.target. Jan 15 13:44:08.009203 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 15 13:44:08.009973 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 15 13:44:08.492247 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 15 13:44:08.517261 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 15 13:44:08.521256 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 13:44:08.537291 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 15 13:44:08.543285 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 13:44:08.556263 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 15 13:44:08.563258 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 13:44:08.579272 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 13:44:08.600265 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... 
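systemd-random-seed, starting above, bridges the entropy pool across reboots: it feeds the seed saved by the previous boot back into the kernel and immediately stores a fresh one. A bare-bones equivalent (the path is the one systemd uses; writing to /dev/urandom mixes data in without crediting entropy):

    import os

    SEED_FILE = '/var/lib/systemd/random-seed'

    def load_and_resave(size=512):
        try:
            with open(SEED_FILE, 'rb') as seed, \
                 open('/dev/urandom', 'wb') as pool:
                pool.write(seed.read(size))   # mix last boot's seed back in
        except FileNotFoundError:
            pass                              # first boot: nothing saved yet
        with open(SEED_FILE, 'wb') as seed:   # stash entropy for next boot
            seed.write(os.urandom(size))
        os.chmod(SEED_FILE, 0o600)

    load_and_resave()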
Jan 15 13:44:08.617860 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 13:44:08.622514 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 13:44:08.628516 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 15 13:44:08.629679 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 13:44:08.632725 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 15 13:44:08.633680 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 15 13:44:08.635095 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 15 13:44:08.636795 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 15 13:44:08.684279 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 15 13:44:08.693278 kernel: loop0: detected capacity change from 0 to 210664 Jan 15 13:44:08.700522 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 15 13:44:08.711552 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 15 13:44:08.720582 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 15 13:44:08.731374 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 13:44:08.755038 systemd-journald[1146]: Time spent on flushing to /var/log/journal/2d61c347ab1c4c4ca3ef23f6fb02c009 is 72.483ms for 1148 entries. Jan 15 13:44:08.755038 systemd-journald[1146]: System Journal (/var/log/journal/2d61c347ab1c4c4ca3ef23f6fb02c009) is 8.0M, max 584.8M, 576.8M free. Jan 15 13:44:08.891813 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 15 13:44:08.891867 systemd-journald[1146]: Received client request to flush runtime journal. Jan 15 13:44:08.894316 kernel: loop1: detected capacity change from 0 to 142488 Jan 15 13:44:08.792780 systemd-tmpfiles[1173]: ACLs are not supported, ignoring. Jan 15 13:44:08.792800 systemd-tmpfiles[1173]: ACLs are not supported, ignoring. Jan 15 13:44:08.818311 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 13:44:08.832536 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 15 13:44:08.841595 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 15 13:44:08.843664 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 15 13:44:08.868505 udevadm[1200]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Jan 15 13:44:08.901280 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 15 13:44:08.928257 kernel: loop2: detected capacity change from 0 to 8 Jan 15 13:44:08.933435 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 15 13:44:08.949644 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 13:44:08.962405 kernel: loop3: detected capacity change from 0 to 140768 Jan 15 13:44:08.989338 systemd-tmpfiles[1216]: ACLs are not supported, ignoring. Jan 15 13:44:08.989896 systemd-tmpfiles[1216]: ACLs are not supported, ignoring. Jan 15 13:44:08.998117 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Jan 15 13:44:09.029269 kernel: loop4: detected capacity change from 0 to 210664 Jan 15 13:44:09.058654 kernel: loop5: detected capacity change from 0 to 142488 Jan 15 13:44:09.088508 kernel: loop6: detected capacity change from 0 to 8 Jan 15 13:44:09.096268 kernel: loop7: detected capacity change from 0 to 140768 Jan 15 13:44:09.155213 (sd-merge)[1220]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Jan 15 13:44:09.156527 (sd-merge)[1220]: Merged extensions into '/usr'. Jan 15 13:44:09.176323 systemd[1]: Reloading requested from client PID 1172 ('systemd-sysext') (unit systemd-sysext.service)... Jan 15 13:44:09.176544 systemd[1]: Reloading... Jan 15 13:44:09.395458 zram_generator::config[1254]: No configuration found. Jan 15 13:44:09.593276 ldconfig[1168]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 15 13:44:09.732218 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 15 13:44:09.810185 systemd[1]: Reloading finished in 631 ms. Jan 15 13:44:09.847134 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 15 13:44:09.851210 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 15 13:44:09.865495 systemd[1]: Starting ensure-sysext.service... Jan 15 13:44:09.881619 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 13:44:09.901897 systemd[1]: Reloading requested from client PID 1302 ('systemctl') (unit ensure-sysext.service)... Jan 15 13:44:09.901944 systemd[1]: Reloading... Jan 15 13:44:09.941212 systemd-tmpfiles[1303]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 15 13:44:09.942017 systemd-tmpfiles[1303]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 15 13:44:09.945070 systemd-tmpfiles[1303]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 15 13:44:09.945726 systemd-tmpfiles[1303]: ACLs are not supported, ignoring. Jan 15 13:44:09.945838 systemd-tmpfiles[1303]: ACLs are not supported, ignoring. Jan 15 13:44:09.958367 systemd-tmpfiles[1303]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 13:44:09.958429 systemd-tmpfiles[1303]: Skipping /boot Jan 15 13:44:09.989593 systemd-tmpfiles[1303]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 13:44:09.989618 systemd-tmpfiles[1303]: Skipping /boot Jan 15 13:44:10.094194 zram_generator::config[1333]: No configuration found. Jan 15 13:44:10.269327 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 15 13:44:10.334593 systemd[1]: Reloading finished in 431 ms. Jan 15 13:44:10.359105 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 15 13:44:10.362884 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 13:44:10.383557 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 15 13:44:10.390498 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
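The (sd-merge) lines show systemd-sysext stacking the four extension images over the base /usr as a read-only overlay; that is what "Merged extensions into '/usr'" amounts to, and the loop devices above are those images being attached. A sketch of the mount command it effectively assembles (the staging path below is assumed; the point is the lowerdir ordering, where the leftmost layer wins):

    exts = ['containerd-flatcar', 'docker-flatcar', 'kubernetes',
            'oem-openstack']

    def overlay_mount_cmd(extensions):
        # extensions sit above the base /usr, which goes last (bottom layer)
        lowers = [f'/run/extensions/{e}/usr' for e in extensions] + ['/usr']
        return ['mount', '-t', 'overlay', 'overlay',
                '-o', 'lowerdir=' + ':'.join(lowers), '/usr']

    print(' '.join(overlay_mount_cmd(exts)))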
Jan 15 13:44:10.394955 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 15 13:44:10.402874 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 13:44:10.413994 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 13:44:10.425510 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 15 13:44:10.439759 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:10.440056 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 13:44:10.451677 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 13:44:10.457566 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 13:44:10.461576 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 13:44:10.463101 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 13:44:10.473619 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 15 13:44:10.474501 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:10.481853 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:10.483474 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 13:44:10.483865 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 13:44:10.484099 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:10.488823 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 15 13:44:10.493330 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 15 13:44:10.507702 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 15 13:44:10.510219 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 13:44:10.511594 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 13:44:10.524801 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 15 13:44:10.526495 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 13:44:10.526734 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 13:44:10.536914 systemd[1]: Finished ensure-sysext.service. Jan 15 13:44:10.542437 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:10.542660 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 13:44:10.550453 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 13:44:10.572746 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Jan 15 13:44:10.574047 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 13:44:10.580336 systemd-udevd[1397]: Using default interface naming scheme 'v255'. Jan 15 13:44:10.587513 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 15 13:44:10.589097 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 15 13:44:10.589144 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 13:44:10.590189 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 13:44:10.595618 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 13:44:10.597434 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 15 13:44:10.601831 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 13:44:10.605522 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 13:44:10.605752 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 13:44:10.607054 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 13:44:10.607320 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 13:44:10.608964 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 13:44:10.630374 augenrules[1428]: No rules Jan 15 13:44:10.633394 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 15 13:44:10.639061 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 13:44:10.648469 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 13:44:10.655000 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 15 13:44:10.856108 systemd-networkd[1436]: lo: Link UP Jan 15 13:44:10.856118 systemd-networkd[1436]: lo: Gained carrier Jan 15 13:44:10.858678 systemd-networkd[1436]: Enumeration completed Jan 15 13:44:10.859282 systemd-networkd[1436]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 13:44:10.859288 systemd-networkd[1436]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 13:44:10.865123 systemd-networkd[1436]: eth0: Link UP Jan 15 13:44:10.865130 systemd-networkd[1436]: eth0: Gained carrier Jan 15 13:44:10.865152 systemd-networkd[1436]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 15 13:44:10.876931 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 13:44:10.886450 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 15 13:44:10.897344 systemd-networkd[1436]: eth0: DHCPv4 address 10.230.66.218/30, gateway 10.230.66.217 acquired from 10.230.66.217 Jan 15 13:44:10.916454 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. 
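The DHCPv4 lease above (10.230.66.218/30 with gateway 10.230.66.217) lands in a /30, which holds exactly two usable host addresses: the gateway and this machine. A quick check with Python's ipaddress module confirms the arithmetic:

    # Worked example: a /30 leaves two usable hosts -- here, the DHCP
    # gateway (.217) and the leased address (.218) from the log above.
    import ipaddress

    iface = ipaddress.ip_interface("10.230.66.218/30")
    print(iface.network)                # 10.230.66.216/30
    print(list(iface.network.hosts()))  # [IPv4Address('10.230.66.217'), IPv4Address('10.230.66.218')]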
Jan 15 13:44:10.935259 kernel: mousedev: PS/2 mouse device common for all mice Jan 15 13:44:10.952161 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 15 13:44:10.956690 systemd[1]: Reached target time-set.target - System Time Set. Jan 15 13:44:10.987421 systemd-resolved[1395]: Positive Trust Anchors: Jan 15 13:44:10.987445 systemd-resolved[1395]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 13:44:10.987496 systemd-resolved[1395]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 13:44:10.998253 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jan 15 13:44:10.998375 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1450) Jan 15 13:44:11.006577 systemd-resolved[1395]: Using system hostname 'srv-6yg2e.gb1.brightbox.com'. Jan 15 13:44:11.013986 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 13:44:11.027829 systemd[1]: Reached target network.target - Network. Jan 15 13:44:11.028777 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 13:44:11.032264 kernel: ACPI: button: Power Button [PWRF] Jan 15 13:44:11.135502 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 15 13:44:11.139714 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 15 13:44:11.139802 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Jan 15 13:44:11.140173 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 15 13:44:11.161315 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 13:44:11.170583 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 15 13:44:11.208708 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 15 13:44:11.245750 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 13:44:12.448879 systemd-timesyncd[1421]: Contacted time server 185.15.104.21:123 (0.flatcar.pool.ntp.org). Jan 15 13:44:12.449106 systemd-timesyncd[1421]: Initial clock synchronization to Wed 2025-01-15 13:44:12.448468 UTC. Jan 15 13:44:12.465727 systemd-resolved[1395]: Clock change detected. Flushing caches. Jan 15 13:44:12.566073 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 15 13:44:12.572996 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 15 13:44:12.608574 lvm[1475]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 15 13:44:12.639979 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 15 13:44:12.670499 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 13:44:12.672392 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
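The positive trust anchor systemd-resolved logs above is the DNSSEC root DS record. A small sketch unpacking its fields per RFC 4034 (key tag 20326, algorithm 8 = RSA/SHA-256, digest type 2 = SHA-256); nothing here touches the network:

    # Unpack the DS-record trust anchor from the systemd-resolved log line.
    ds_record = (". IN DS 20326 8 2 "
                 "e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d")

    owner, cls, rtype, key_tag, algorithm, digest_type, digest = ds_record.split()
    assert (cls, rtype) == ("IN", "DS")
    print(f"owner={owner!r} (the root zone)")
    print(f"key tag {key_tag}, algorithm {algorithm} (RSA/SHA-256), "
          f"digest type {digest_type} (SHA-256)")
    print(f"digest: {digest}")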
Jan 15 13:44:12.673216 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 13:44:12.674414 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 15 13:44:12.675230 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 15 13:44:12.676505 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 15 13:44:12.677398 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 15 13:44:12.678236 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 15 13:44:12.679057 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 15 13:44:12.679105 systemd[1]: Reached target paths.target - Path Units. Jan 15 13:44:12.679769 systemd[1]: Reached target timers.target - Timer Units. Jan 15 13:44:12.682539 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 15 13:44:12.685655 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 15 13:44:12.692175 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 15 13:44:12.694897 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 15 13:44:12.696342 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 15 13:44:12.697234 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 13:44:12.697975 systemd[1]: Reached target basic.target - Basic System. Jan 15 13:44:12.698727 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 15 13:44:12.698779 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 15 13:44:12.705664 systemd[1]: Starting containerd.service - containerd container runtime... Jan 15 13:44:12.711741 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 15 13:44:12.715503 lvm[1482]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 15 13:44:12.721760 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 15 13:44:12.730645 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 15 13:44:12.734701 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 15 13:44:12.735882 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 15 13:44:12.743707 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 15 13:44:12.753627 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 15 13:44:12.758937 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 15 13:44:12.767651 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 15 13:44:12.784730 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 15 13:44:12.788676 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 15 13:44:12.789617 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
Jan 15 13:44:12.792830 jq[1486]: false Jan 15 13:44:12.801662 systemd[1]: Starting update-engine.service - Update Engine... Jan 15 13:44:12.807596 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 15 13:44:12.812492 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 15 13:44:12.818526 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 15 13:44:12.819353 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 15 13:44:12.870875 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 15 13:44:12.871591 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 15 13:44:12.907467 jq[1498]: true Jan 15 13:44:12.914518 extend-filesystems[1487]: Found loop4 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found loop5 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found loop6 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found loop7 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda1 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda2 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda3 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found usr Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda4 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda6 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda7 Jan 15 13:44:12.914518 extend-filesystems[1487]: Found vda9 Jan 15 13:44:12.914518 extend-filesystems[1487]: Checking size of /dev/vda9 Jan 15 13:44:12.969505 tar[1502]: linux-amd64/helm Jan 15 13:44:12.970027 update_engine[1495]: I20250115 13:44:12.942113 1495 main.cc:92] Flatcar Update Engine starting Jan 15 13:44:12.926924 systemd[1]: motdgen.service: Deactivated successfully. Jan 15 13:44:12.927211 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 15 13:44:12.943764 (ntainerd)[1513]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 15 13:44:12.974717 dbus-daemon[1485]: [system] SELinux support is enabled Jan 15 13:44:12.976030 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 15 13:44:12.979548 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 15 13:44:12.979606 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 15 13:44:12.980927 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 15 13:44:12.980977 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 15 13:44:12.995611 systemd[1]: Started update-engine.service - Update Engine. Jan 15 13:44:13.005118 update_engine[1495]: I20250115 13:44:12.997782 1495 update_check_scheduler.cc:74] Next update check in 7m3s Jan 15 13:44:13.005204 extend-filesystems[1487]: Resized partition /dev/vda9 Jan 15 13:44:13.004688 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jan 15 13:44:13.008642 dbus-daemon[1485]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.3' (uid=244 pid=1436 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Jan 15 13:44:13.018465 extend-filesystems[1525]: resize2fs 1.47.1 (20-May-2024) Jan 15 13:44:13.029980 jq[1518]: true Jan 15 13:44:13.039712 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Jan 15 13:44:13.020672 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Jan 15 13:44:13.127499 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1447) Jan 15 13:44:13.196498 systemd-logind[1493]: Watching system buttons on /dev/input/event2 (Power Button) Jan 15 13:44:13.197183 systemd-logind[1493]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 15 13:44:13.197882 systemd-logind[1493]: New seat seat0. Jan 15 13:44:13.199334 systemd[1]: Started systemd-logind.service - User Login Management. Jan 15 13:44:13.414059 bash[1546]: Updated "/home/core/.ssh/authorized_keys" Jan 15 13:44:13.416189 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 15 13:44:13.458227 systemd[1]: Starting sshkeys.service... Jan 15 13:44:13.458608 systemd-networkd[1436]: eth0: Gained IPv6LL Jan 15 13:44:13.469385 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 15 13:44:13.470923 systemd[1]: Reached target network-online.target - Network is Online. Jan 15 13:44:13.484874 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 13:44:13.497694 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 15 13:44:13.536963 dbus-daemon[1485]: [system] Successfully activated service 'org.freedesktop.hostname1' Jan 15 13:44:13.543630 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Jan 15 13:44:13.546884 dbus-daemon[1485]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1526 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Jan 15 13:44:13.556936 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 15 13:44:13.565884 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 15 13:44:13.569342 systemd[1]: Starting polkit.service - Authorization Manager... Jan 15 13:44:13.574471 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Jan 15 13:44:13.587594 locksmithd[1524]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 15 13:44:13.591508 extend-filesystems[1525]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 15 13:44:13.591508 extend-filesystems[1525]: old_desc_blocks = 1, new_desc_blocks = 8 Jan 15 13:44:13.591508 extend-filesystems[1525]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Jan 15 13:44:13.606824 extend-filesystems[1487]: Resized filesystem in /dev/vda9 Jan 15 13:44:13.594456 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 15 13:44:13.594844 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 15 13:44:13.642757 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
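The extend-filesystems/resize2fs exchange above records an online grow of /dev/vda9 from 1617920 to 15121403 blocks of 4 KiB each. A back-of-the-envelope check of the sizes involved:

    # Sanity-check the online resize logged above: 4 KiB blocks, before/after.
    BLOCK = 4096
    old_blocks, new_blocks = 1_617_920, 15_121_403

    for label, blocks in (("before", old_blocks), ("after", new_blocks)):
        size = blocks * BLOCK
        print(f"{label}: {blocks} blocks = {size / 1e9:.1f} GB ({size / 2**30:.1f} GiB)")
    # before: 1617920 blocks = 6.6 GB (6.2 GiB)
    # after:  15121403 blocks = 61.9 GB (57.7 GiB)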
Jan 15 13:44:13.655756 polkitd[1561]: Started polkitd version 121 Jan 15 13:44:13.678900 sshd_keygen[1515]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 15 13:44:13.688493 polkitd[1561]: Loading rules from directory /etc/polkit-1/rules.d Jan 15 13:44:13.688637 polkitd[1561]: Loading rules from directory /usr/share/polkit-1/rules.d Jan 15 13:44:13.692306 polkitd[1561]: Finished loading, compiling and executing 2 rules Jan 15 13:44:13.698605 dbus-daemon[1485]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Jan 15 13:44:13.698881 systemd[1]: Started polkit.service - Authorization Manager. Jan 15 13:44:13.701132 polkitd[1561]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Jan 15 13:44:13.836597 systemd-hostnamed[1526]: Hostname set to (static) Jan 15 13:44:13.864605 containerd[1513]: time="2025-01-15T13:44:13.856693607Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Jan 15 13:44:13.879572 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 15 13:44:13.907334 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 15 13:44:13.911980 containerd[1513]: time="2025-01-15T13:44:13.911835193Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 15 13:44:13.916816 containerd[1513]: time="2025-01-15T13:44:13.916758917Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 15 13:44:13.916816 containerd[1513]: time="2025-01-15T13:44:13.916808980Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 15 13:44:13.916983 containerd[1513]: time="2025-01-15T13:44:13.916834719Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 15 13:44:13.917616 containerd[1513]: time="2025-01-15T13:44:13.917407592Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 15 13:44:13.917616 containerd[1513]: time="2025-01-15T13:44:13.917497510Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.919778332Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.919820736Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.920172054Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.920197017Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.920216945Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.920233057Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.920391457Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 15 13:44:13.921663 containerd[1513]: time="2025-01-15T13:44:13.920887405Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 15 13:44:13.923792 containerd[1513]: time="2025-01-15T13:44:13.923584624Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 15 13:44:13.923792 containerd[1513]: time="2025-01-15T13:44:13.923619228Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 15 13:44:13.923931 containerd[1513]: time="2025-01-15T13:44:13.923798944Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 15 13:44:13.923931 containerd[1513]: time="2025-01-15T13:44:13.923918999Z" level=info msg="metadata content store policy set" policy=shared Jan 15 13:44:13.932313 containerd[1513]: time="2025-01-15T13:44:13.932259314Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 15 13:44:13.932389 containerd[1513]: time="2025-01-15T13:44:13.932362570Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 15 13:44:13.932484 containerd[1513]: time="2025-01-15T13:44:13.932393295Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 15 13:44:13.932551 containerd[1513]: time="2025-01-15T13:44:13.932490691Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 15 13:44:13.932659 containerd[1513]: time="2025-01-15T13:44:13.932518109Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 15 13:44:13.933057 containerd[1513]: time="2025-01-15T13:44:13.932998055Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 15 13:44:13.934988 containerd[1513]: time="2025-01-15T13:44:13.934931562Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 15 13:44:13.936379 containerd[1513]: time="2025-01-15T13:44:13.936314549Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 15 13:44:13.936379 containerd[1513]: time="2025-01-15T13:44:13.936369500Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 15 13:44:13.936605 containerd[1513]: time="2025-01-15T13:44:13.936393706Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." 
type=io.containerd.sandbox.controller.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.936917057Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.936993975Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937029106Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937086007Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937108825Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937146218Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937168584Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937191624Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937249027Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937301243Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937339239Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.937469 containerd[1513]: time="2025-01-15T13:44:13.937362858Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939484 containerd[1513]: time="2025-01-15T13:44:13.937425401Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939571 containerd[1513]: time="2025-01-15T13:44:13.939495350Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939571 containerd[1513]: time="2025-01-15T13:44:13.939520870Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939680 containerd[1513]: time="2025-01-15T13:44:13.939585118Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939734 containerd[1513]: time="2025-01-15T13:44:13.939709521Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939784 containerd[1513]: time="2025-01-15T13:44:13.939762234Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939824 containerd[1513]: time="2025-01-15T13:44:13.939790123Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." 
type=io.containerd.grpc.v1 Jan 15 13:44:13.939824 containerd[1513]: time="2025-01-15T13:44:13.939812643Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939958 containerd[1513]: time="2025-01-15T13:44:13.939833150Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.939958 containerd[1513]: time="2025-01-15T13:44:13.939921268Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 15 13:44:13.940047 containerd[1513]: time="2025-01-15T13:44:13.939976949Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.940047 containerd[1513]: time="2025-01-15T13:44:13.939999488Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.940047 containerd[1513]: time="2025-01-15T13:44:13.940028170Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 15 13:44:13.940702 containerd[1513]: time="2025-01-15T13:44:13.940189481Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941682925Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941736408Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941759252Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941774938Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941802091Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941829996Z" level=info msg="NRI interface is disabled by configuration." Jan 15 13:44:13.943759 containerd[1513]: time="2025-01-15T13:44:13.941848443Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Jan 15 13:44:13.944073 containerd[1513]: time="2025-01-15T13:44:13.942355962Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 15 13:44:13.944073 containerd[1513]: time="2025-01-15T13:44:13.942474109Z" level=info msg="Connect containerd service" Jan 15 13:44:13.944073 containerd[1513]: time="2025-01-15T13:44:13.942565491Z" level=info msg="using legacy CRI server" Jan 15 13:44:13.944073 containerd[1513]: time="2025-01-15T13:44:13.942583633Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 15 13:44:13.944073 containerd[1513]: time="2025-01-15T13:44:13.942772490Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 15 13:44:13.949574 containerd[1513]: time="2025-01-15T13:44:13.946625494Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 13:44:13.949574 
containerd[1513]: time="2025-01-15T13:44:13.948357166Z" level=info msg="Start subscribing containerd event" Jan 15 13:44:13.949574 containerd[1513]: time="2025-01-15T13:44:13.948454837Z" level=info msg="Start recovering state" Jan 15 13:44:13.949574 containerd[1513]: time="2025-01-15T13:44:13.948611996Z" level=info msg="Start event monitor" Jan 15 13:44:13.949574 containerd[1513]: time="2025-01-15T13:44:13.948643646Z" level=info msg="Start snapshots syncer" Jan 15 13:44:13.949574 containerd[1513]: time="2025-01-15T13:44:13.948660408Z" level=info msg="Start cni network conf syncer for default" Jan 15 13:44:13.949574 containerd[1513]: time="2025-01-15T13:44:13.948673907Z" level=info msg="Start streaming server" Jan 15 13:44:13.952466 containerd[1513]: time="2025-01-15T13:44:13.951088472Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 15 13:44:13.952466 containerd[1513]: time="2025-01-15T13:44:13.951293708Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 15 13:44:13.967268 containerd[1513]: time="2025-01-15T13:44:13.963747536Z" level=info msg="containerd successfully booted in 0.108357s" Jan 15 13:44:13.965460 systemd[1]: Started containerd.service - containerd container runtime. Jan 15 13:44:13.982165 systemd[1]: issuegen.service: Deactivated successfully. Jan 15 13:44:13.982764 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 15 13:44:13.993631 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 15 13:44:14.038150 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 15 13:44:14.054025 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 15 13:44:14.064058 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 15 13:44:14.066001 systemd[1]: Reached target getty.target - Login Prompts. Jan 15 13:44:14.562106 tar[1502]: linux-amd64/LICENSE Jan 15 13:44:14.563820 tar[1502]: linux-amd64/README.md Jan 15 13:44:14.581384 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 15 13:44:14.982371 systemd-networkd[1436]: eth0: Ignoring DHCPv6 address 2a02:1348:179:90b6:24:19ff:fee6:42da/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:90b6:24:19ff:fee6:42da/64 assigned by NDisc. Jan 15 13:44:14.982389 systemd-networkd[1436]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 15 13:44:15.243705 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 13:44:15.247038 (kubelet)[1610]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 13:44:16.121476 kubelet[1610]: E0115 13:44:16.121360 1610 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 13:44:16.124627 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 13:44:16.124886 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 13:44:16.125514 systemd[1]: kubelet.service: Consumed 1.631s CPU time. Jan 15 13:44:18.195070 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
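The kubelet crash above is expected on a freshly provisioned node: /var/lib/kubelet/config.yaml is normally written by `kubeadm init` or `kubeadm join`, so the unit keeps failing (and systemd keeps restarting it) until the node is bootstrapped. Purely to illustrate the file's shape, a hypothetical minimal KubeletConfiguration; the field values are assumptions, not what kubeadm would generate:

    # Hypothetical sketch only: write a minimal KubeletConfiguration so the
    # service can start. On a kubeadm-managed node this file is produced by
    # `kubeadm init`/`kubeadm join`. Requires root to write under /var/lib.
    from pathlib import Path

    MINIMAL_KUBELET_CONFIG = """\
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd   # matches the SystemdCgroup:true runc option logged above
    failSwapOn: true
    """

    path = Path("/var/lib/kubelet/config.yaml")
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(MINIMAL_KUBELET_CONFIG)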
Jan 15 13:44:18.211044 systemd[1]: Started sshd@0-10.230.66.218:22-147.75.109.163:40402.service - OpenSSH per-connection server daemon (147.75.109.163:40402). Jan 15 13:44:19.130659 sshd[1621]: Accepted publickey for core from 147.75.109.163 port 40402 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:19.137984 sshd[1621]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:19.144911 login[1598]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 15 13:44:19.147889 login[1599]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 15 13:44:19.165836 systemd-logind[1493]: New session 1 of user core. Jan 15 13:44:19.169346 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 15 13:44:19.177199 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 15 13:44:19.181997 systemd-logind[1493]: New session 3 of user core. Jan 15 13:44:19.192079 systemd-logind[1493]: New session 2 of user core. Jan 15 13:44:19.207886 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 15 13:44:19.216063 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 15 13:44:19.228590 (systemd)[1629]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 15 13:44:19.370652 systemd[1629]: Queued start job for default target default.target. Jan 15 13:44:19.378500 systemd[1629]: Created slice app.slice - User Application Slice. Jan 15 13:44:19.378545 systemd[1629]: Reached target paths.target - Paths. Jan 15 13:44:19.378568 systemd[1629]: Reached target timers.target - Timers. Jan 15 13:44:19.380956 systemd[1629]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 15 13:44:19.398271 systemd[1629]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 15 13:44:19.398509 systemd[1629]: Reached target sockets.target - Sockets. Jan 15 13:44:19.398536 systemd[1629]: Reached target basic.target - Basic System. Jan 15 13:44:19.398605 systemd[1629]: Reached target default.target - Main User Target. Jan 15 13:44:19.398673 systemd[1629]: Startup finished in 160ms. Jan 15 13:44:19.398910 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 15 13:44:19.412046 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 15 13:44:19.415522 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 15 13:44:19.418551 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 15 13:44:20.037944 coreos-metadata[1484]: Jan 15 13:44:20.037 WARN failed to locate config-drive, using the metadata service API instead Jan 15 13:44:20.062898 systemd[1]: Started sshd@1-10.230.66.218:22-147.75.109.163:40412.service - OpenSSH per-connection server daemon (147.75.109.163:40412). 
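The `SHA256:yhnr...` string in the sshd lines above is an OpenSSH-style key fingerprint: the SHA-256 digest of the raw public-key blob, base64-encoded without padding. A sketch of the derivation (the authorized_keys path in the commented usage line is an assumption):

    # Derive an OpenSSH SHA256 fingerprint from an authorized_keys entry.
    import base64, hashlib

    def ssh_fingerprint(authorized_keys_line: str) -> str:
        blob_b64 = authorized_keys_line.split()[1]   # "ssh-rsa AAAA... comment"
        digest = hashlib.sha256(base64.b64decode(blob_b64)).digest()
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    # print(ssh_fingerprint(open("/home/core/.ssh/authorized_keys").read().splitlines()[0]))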
Jan 15 13:44:20.069752 coreos-metadata[1484]: Jan 15 13:44:20.068 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 15 13:44:20.075738 coreos-metadata[1484]: Jan 15 13:44:20.075 INFO Fetch failed with 404: resource not found Jan 15 13:44:20.076021 coreos-metadata[1484]: Jan 15 13:44:20.075 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 13:44:20.076880 coreos-metadata[1484]: Jan 15 13:44:20.076 INFO Fetch successful Jan 15 13:44:20.077160 coreos-metadata[1484]: Jan 15 13:44:20.077 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 15 13:44:20.091277 coreos-metadata[1484]: Jan 15 13:44:20.091 INFO Fetch successful Jan 15 13:44:20.091510 coreos-metadata[1484]: Jan 15 13:44:20.091 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 15 13:44:20.104352 coreos-metadata[1484]: Jan 15 13:44:20.104 INFO Fetch successful Jan 15 13:44:20.104701 coreos-metadata[1484]: Jan 15 13:44:20.104 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 15 13:44:20.117683 coreos-metadata[1484]: Jan 15 13:44:20.117 INFO Fetch successful Jan 15 13:44:20.117975 coreos-metadata[1484]: Jan 15 13:44:20.117 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 15 13:44:20.133505 coreos-metadata[1484]: Jan 15 13:44:20.133 INFO Fetch successful Jan 15 13:44:20.176647 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 15 13:44:20.178942 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 15 13:44:20.743939 coreos-metadata[1559]: Jan 15 13:44:20.743 WARN failed to locate config-drive, using the metadata service API instead Jan 15 13:44:20.767397 coreos-metadata[1559]: Jan 15 13:44:20.767 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 15 13:44:20.791494 coreos-metadata[1559]: Jan 15 13:44:20.791 INFO Fetch successful Jan 15 13:44:20.791717 coreos-metadata[1559]: Jan 15 13:44:20.791 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 15 13:44:20.822484 coreos-metadata[1559]: Jan 15 13:44:20.822 INFO Fetch successful Jan 15 13:44:20.827340 unknown[1559]: wrote ssh authorized keys file for user: core Jan 15 13:44:20.856045 update-ssh-keys[1673]: Updated "/home/core/.ssh/authorized_keys" Jan 15 13:44:20.857788 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 15 13:44:20.861099 systemd[1]: Finished sshkeys.service. Jan 15 13:44:20.862900 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 15 13:44:20.863386 systemd[1]: Startup finished in 1.556s (kernel) + 15.425s (initrd) + 12.556s (userspace) = 29.537s. Jan 15 13:44:20.947093 sshd[1664]: Accepted publickey for core from 147.75.109.163 port 40412 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:20.950140 sshd[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:20.957584 systemd-logind[1493]: New session 4 of user core. Jan 15 13:44:20.969677 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 15 13:44:21.566060 sshd[1664]: pam_unix(sshd:session): session closed for user core Jan 15 13:44:21.571978 systemd[1]: sshd@1-10.230.66.218:22-147.75.109.163:40412.service: Deactivated successfully. Jan 15 13:44:21.574593 systemd[1]: session-4.scope: Deactivated successfully. 
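The coreos-metadata lines above show the agent's fallback order: the OpenStack-flavoured JSON document 404s, so it switches to the EC2-style per-key endpoints, one fetch each. A hedged urllib sketch of the same sequence; the key list and timeout are assumptions:

    # Reproduce the metadata fallback logged above: try the OpenStack JSON
    # endpoint first, fall back to per-key EC2-style paths on a 404.
    from urllib.request import urlopen
    from urllib.error import HTTPError

    BASE = "http://169.254.169.254"
    KEYS = ["hostname", "instance-id", "instance-type", "local-ipv4", "public-ipv4"]

    def fetch(path: str) -> str:
        with urlopen(BASE + path, timeout=5) as resp:
            return resp.read().decode()

    try:
        metadata = fetch("/openstack/2012-08-10/meta_data.json")  # 404s on this host
    except HTTPError as err:
        if err.code != 404:
            raise
        metadata = {key: fetch(f"/latest/meta-data/{key}") for key in KEYS}
    print(metadata)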
Jan 15 13:44:21.575513 systemd-logind[1493]: Session 4 logged out. Waiting for processes to exit. Jan 15 13:44:21.577456 systemd-logind[1493]: Removed session 4. Jan 15 13:44:21.728146 systemd[1]: Started sshd@2-10.230.66.218:22-147.75.109.163:40420.service - OpenSSH per-connection server daemon (147.75.109.163:40420). Jan 15 13:44:22.608973 sshd[1682]: Accepted publickey for core from 147.75.109.163 port 40420 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:22.611327 sshd[1682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:22.620160 systemd-logind[1493]: New session 5 of user core. Jan 15 13:44:22.625666 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 15 13:44:23.222838 sshd[1682]: pam_unix(sshd:session): session closed for user core Jan 15 13:44:23.228008 systemd[1]: sshd@2-10.230.66.218:22-147.75.109.163:40420.service: Deactivated successfully. Jan 15 13:44:23.230736 systemd[1]: session-5.scope: Deactivated successfully. Jan 15 13:44:23.231669 systemd-logind[1493]: Session 5 logged out. Waiting for processes to exit. Jan 15 13:44:23.233156 systemd-logind[1493]: Removed session 5. Jan 15 13:44:23.380846 systemd[1]: Started sshd@3-10.230.66.218:22-147.75.109.163:40426.service - OpenSSH per-connection server daemon (147.75.109.163:40426). Jan 15 13:44:24.290348 sshd[1689]: Accepted publickey for core from 147.75.109.163 port 40426 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:24.292845 sshd[1689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:24.299228 systemd-logind[1493]: New session 6 of user core. Jan 15 13:44:24.308695 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 15 13:44:24.911326 sshd[1689]: pam_unix(sshd:session): session closed for user core Jan 15 13:44:24.915936 systemd[1]: sshd@3-10.230.66.218:22-147.75.109.163:40426.service: Deactivated successfully. Jan 15 13:44:24.918720 systemd[1]: session-6.scope: Deactivated successfully. Jan 15 13:44:24.921981 systemd-logind[1493]: Session 6 logged out. Waiting for processes to exit. Jan 15 13:44:24.924346 systemd-logind[1493]: Removed session 6. Jan 15 13:44:25.081272 systemd[1]: Started sshd@4-10.230.66.218:22-147.75.109.163:40432.service - OpenSSH per-connection server daemon (147.75.109.163:40432). Jan 15 13:44:25.962115 sshd[1696]: Accepted publickey for core from 147.75.109.163 port 40432 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:25.964371 sshd[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:25.971196 systemd-logind[1493]: New session 7 of user core. Jan 15 13:44:25.983948 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 15 13:44:26.375401 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 15 13:44:26.382752 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 13:44:26.475772 sudo[1702]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 15 13:44:26.476272 sudo[1702]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 13:44:26.510686 sudo[1702]: pam_unix(sudo:session): session closed for user root Jan 15 13:44:26.655736 sshd[1696]: pam_unix(sshd:session): session closed for user core Jan 15 13:44:26.662316 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 15 13:44:26.663308 systemd[1]: sshd@4-10.230.66.218:22-147.75.109.163:40432.service: Deactivated successfully. Jan 15 13:44:26.666758 systemd[1]: session-7.scope: Deactivated successfully. Jan 15 13:44:26.669072 systemd-logind[1493]: Session 7 logged out. Waiting for processes to exit. Jan 15 13:44:26.675002 (kubelet)[1708]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 13:44:26.679682 systemd-logind[1493]: Removed session 7. Jan 15 13:44:26.755116 kubelet[1708]: E0115 13:44:26.755020 1708 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 13:44:26.760489 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 13:44:26.760756 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 13:44:26.818908 systemd[1]: Started sshd@5-10.230.66.218:22-147.75.109.163:40444.service - OpenSSH per-connection server daemon (147.75.109.163:40444). Jan 15 13:44:27.703026 sshd[1720]: Accepted publickey for core from 147.75.109.163 port 40444 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:27.705356 sshd[1720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:27.713096 systemd-logind[1493]: New session 8 of user core. Jan 15 13:44:27.723761 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 15 13:44:28.180745 sudo[1724]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 15 13:44:28.181207 sudo[1724]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 13:44:28.187709 sudo[1724]: pam_unix(sudo:session): session closed for user root Jan 15 13:44:28.196003 sudo[1723]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Jan 15 13:44:28.196485 sudo[1723]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 13:44:28.214799 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Jan 15 13:44:28.219958 auditctl[1727]: No rules Jan 15 13:44:28.221950 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 13:44:28.222348 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Jan 15 13:44:28.227869 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 15 13:44:28.273104 augenrules[1745]: No rules Jan 15 13:44:28.274596 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 15 13:44:28.278098 sudo[1723]: pam_unix(sudo:session): session closed for user root Jan 15 13:44:28.424275 sshd[1720]: pam_unix(sshd:session): session closed for user core Jan 15 13:44:28.428559 systemd-logind[1493]: Session 8 logged out. Waiting for processes to exit. Jan 15 13:44:28.429664 systemd[1]: sshd@5-10.230.66.218:22-147.75.109.163:40444.service: Deactivated successfully. Jan 15 13:44:28.432194 systemd[1]: session-8.scope: Deactivated successfully. Jan 15 13:44:28.434753 systemd-logind[1493]: Removed session 8. Jan 15 13:44:28.588930 systemd[1]: Started sshd@6-10.230.66.218:22-147.75.109.163:41672.service - OpenSSH per-connection server daemon (147.75.109.163:41672). 
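The `auditctl: No rules` output a few entries back is what auditctl prints when the kernel's audit rule list is empty (both when listing with -l and when flushing); restarting audit-rules.service then reloads the, here empty, rule set via augenrules. A trivial reproduction of the listing check, assuming auditctl is on PATH and the caller holds CAP_AUDIT_CONTROL:

    # List the kernel audit rules; prints "No rules" when the list is empty,
    # matching the auditctl/augenrules lines in the log above.
    import subprocess

    result = subprocess.run(["auditctl", "-l"], capture_output=True, text=True, check=True)
    print(result.stdout.strip())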
Jan 15 13:44:29.472710 sshd[1753]: Accepted publickey for core from 147.75.109.163 port 41672 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:44:29.475179 sshd[1753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:44:29.483660 systemd-logind[1493]: New session 9 of user core. Jan 15 13:44:29.489754 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 15 13:44:29.953254 sudo[1756]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 15 13:44:29.954018 sudo[1756]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 13:44:30.671844 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 15 13:44:30.689144 (dockerd)[1772]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 15 13:44:31.401326 dockerd[1772]: time="2025-01-15T13:44:31.401220870Z" level=info msg="Starting up" Jan 15 13:44:31.598900 dockerd[1772]: time="2025-01-15T13:44:31.598823371Z" level=info msg="Loading containers: start." Jan 15 13:44:31.763748 kernel: Initializing XFRM netlink socket Jan 15 13:44:31.891673 systemd-networkd[1436]: docker0: Link UP Jan 15 13:44:31.908771 dockerd[1772]: time="2025-01-15T13:44:31.908662224Z" level=info msg="Loading containers: done." Jan 15 13:44:31.929904 dockerd[1772]: time="2025-01-15T13:44:31.929658644Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 15 13:44:31.929904 dockerd[1772]: time="2025-01-15T13:44:31.929811427Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Jan 15 13:44:31.930412 dockerd[1772]: time="2025-01-15T13:44:31.930252516Z" level=info msg="Daemon has completed initialization" Jan 15 13:44:31.976516 dockerd[1772]: time="2025-01-15T13:44:31.976271590Z" level=info msg="API listen on /run/docker.sock" Jan 15 13:44:31.976972 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 15 13:44:33.638114 containerd[1513]: time="2025-01-15T13:44:33.637920254Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.8\"" Jan 15 13:44:34.449378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1547516493.mount: Deactivated successfully. 
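Once dockerd reports "API listen on /run/docker.sock" above, the Engine API is reachable over that Unix socket with plain HTTP. A minimal sketch, no Docker SDK required (run as root or a docker-group member; HTTP/1.0 so the server closes the connection for us):

    # Talk to the Docker Engine API endpoint announced in the log above.
    import socket

    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
        sock.connect("/run/docker.sock")
        sock.sendall(b"GET /version HTTP/1.0\r\nHost: docker\r\n\r\n")
        response = b""
        while chunk := sock.recv(4096):
            response += chunk
    print(response.decode(errors="replace"))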
Jan 15 13:44:36.875237 containerd[1513]: time="2025-01-15T13:44:36.875013874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:44:36.876395 containerd[1513]: time="2025-01-15T13:44:36.876315958Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.8: active requests=0, bytes read=32675650" Jan 15 13:44:36.877763 containerd[1513]: time="2025-01-15T13:44:36.877693429Z" level=info msg="ImageCreate event name:\"sha256:772392d372035bf92e430e758ad0446146d82b7192358c8651252e4fb49c43dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:44:36.887522 containerd[1513]: time="2025-01-15T13:44:36.886779331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:f0e1b3de0c2e98e6c6abd73edf9d3b8e4d44460656cde0ebb92e2d9206961fcb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:44:36.889588 containerd[1513]: time="2025-01-15T13:44:36.888365885Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.8\" with image id \"sha256:772392d372035bf92e430e758ad0446146d82b7192358c8651252e4fb49c43dd\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:f0e1b3de0c2e98e6c6abd73edf9d3b8e4d44460656cde0ebb92e2d9206961fcb\", size \"32672442\" in 3.24915385s" Jan 15 13:44:36.889588 containerd[1513]: time="2025-01-15T13:44:36.888548194Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.8\" returns image reference \"sha256:772392d372035bf92e430e758ad0446146d82b7192358c8651252e4fb49c43dd\"" Jan 15 13:44:36.926799 containerd[1513]: time="2025-01-15T13:44:36.926724653Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.8\"" Jan 15 13:44:37.011768 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 15 13:44:37.021765 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 13:44:37.371702 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 13:44:37.374505 (kubelet)[1988]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 13:44:37.458841 kubelet[1988]: E0115 13:44:37.458640 1988 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 13:44:37.461761 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 13:44:37.462042 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
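The pull above moved roughly 32.7 MB of kube-apiserver layers in about 3.25 s; a one-liner puts the effective throughput near 10 MB/s:

    # Rough throughput for the kube-apiserver pull logged above.
    bytes_read = 32_675_650   # "bytes read" reported by containerd
    seconds = 3.24915385      # pull duration from the same log line
    print(f"{bytes_read / seconds / 1e6:.1f} MB/s")   # ~10.1 MB/s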
Jan 15 13:44:39.762859 containerd[1513]: time="2025-01-15T13:44:39.762608806Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:39.764157 containerd[1513]: time="2025-01-15T13:44:39.764098780Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.8: active requests=0, bytes read=29606417"
Jan 15 13:44:39.764842 containerd[1513]: time="2025-01-15T13:44:39.764777327Z" level=info msg="ImageCreate event name:\"sha256:85333d41dd3ce32d8344280c6d533d4c8f66252e4c28e332a2322ba3837f7bd6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:39.768544 containerd[1513]: time="2025-01-15T13:44:39.768471304Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:124f66b7e877eb5a80a40503057299bb60e6a5f2130905f4e3293dabf194c397\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:39.770469 containerd[1513]: time="2025-01-15T13:44:39.770291165Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.8\" with image id \"sha256:85333d41dd3ce32d8344280c6d533d4c8f66252e4c28e332a2322ba3837f7bd6\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:124f66b7e877eb5a80a40503057299bb60e6a5f2130905f4e3293dabf194c397\", size \"31051521\" in 2.843504001s"
Jan 15 13:44:39.770469 containerd[1513]: time="2025-01-15T13:44:39.770339620Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.8\" returns image reference \"sha256:85333d41dd3ce32d8344280c6d533d4c8f66252e4c28e332a2322ba3837f7bd6\""
Jan 15 13:44:39.802848 containerd[1513]: time="2025-01-15T13:44:39.802783951Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.8\""
Jan 15 13:44:41.584814 containerd[1513]: time="2025-01-15T13:44:41.584718561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:41.586396 containerd[1513]: time="2025-01-15T13:44:41.586335823Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.8: active requests=0, bytes read=17783043"
Jan 15 13:44:41.587549 containerd[1513]: time="2025-01-15T13:44:41.587491622Z" level=info msg="ImageCreate event name:\"sha256:eb53b988d5e03f329b5fdba21cbbbae48e1619b199689e7448095b31843b2c43\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:41.591702 containerd[1513]: time="2025-01-15T13:44:41.591624789Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c8bdeac2590c99c1a77e33995423ddb6633ff90a82a2aa455442e0a8079ef8c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:41.593455 containerd[1513]: time="2025-01-15T13:44:41.593265256Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.8\" with image id \"sha256:eb53b988d5e03f329b5fdba21cbbbae48e1619b199689e7448095b31843b2c43\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c8bdeac2590c99c1a77e33995423ddb6633ff90a82a2aa455442e0a8079ef8c7\", size \"19228165\" in 1.790418475s"
Jan 15 13:44:41.593455 containerd[1513]: time="2025-01-15T13:44:41.593311930Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.8\" returns image reference \"sha256:eb53b988d5e03f329b5fdba21cbbbae48e1619b199689e7448095b31843b2c43\""
Jan 15 13:44:41.624810 containerd[1513]: time="2025-01-15T13:44:41.624755078Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.8\""
Jan 15 13:44:43.416723 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2419144061.mount: Deactivated successfully.
Jan 15 13:44:44.181643 containerd[1513]: time="2025-01-15T13:44:44.181567348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:44.182769 containerd[1513]: time="2025-01-15T13:44:44.182692406Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.8: active requests=0, bytes read=29057478"
Jan 15 13:44:44.183526 containerd[1513]: time="2025-01-15T13:44:44.183487280Z" level=info msg="ImageCreate event name:\"sha256:ce61fda67eb41cf09d2b984e7979e289b5042e3983ddfc67be678425632cc0d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:44.186398 containerd[1513]: time="2025-01-15T13:44:44.186360267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:f6d6be9417e22af78905000ac4fd134896bacd2188ea63c7cac8edd7a5d7e9b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:44.187750 containerd[1513]: time="2025-01-15T13:44:44.187710436Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.8\" with image id \"sha256:ce61fda67eb41cf09d2b984e7979e289b5042e3983ddfc67be678425632cc0d2\", repo tag \"registry.k8s.io/kube-proxy:v1.30.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:f6d6be9417e22af78905000ac4fd134896bacd2188ea63c7cac8edd7a5d7e9b5\", size \"29056489\" in 2.562895506s"
Jan 15 13:44:44.188009 containerd[1513]: time="2025-01-15T13:44:44.187860673Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.8\" returns image reference \"sha256:ce61fda67eb41cf09d2b984e7979e289b5042e3983ddfc67be678425632cc0d2\""
Jan 15 13:44:44.217953 containerd[1513]: time="2025-01-15T13:44:44.217897519Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Jan 15 13:44:44.861898 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2558577471.mount: Deactivated successfully.
Jan 15 13:44:45.005364 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 15 13:44:46.434789 containerd[1513]: time="2025-01-15T13:44:46.434385459Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:46.436044 containerd[1513]: time="2025-01-15T13:44:46.435999182Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
Jan 15 13:44:46.436990 containerd[1513]: time="2025-01-15T13:44:46.436915200Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:46.440973 containerd[1513]: time="2025-01-15T13:44:46.440903793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:46.443344 containerd[1513]: time="2025-01-15T13:44:46.443305953Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.225354045s"
Jan 15 13:44:46.443681 containerd[1513]: time="2025-01-15T13:44:46.443536696Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Jan 15 13:44:46.495825 containerd[1513]: time="2025-01-15T13:44:46.495757062Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Jan 15 13:44:47.097038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3787383813.mount: Deactivated successfully.
Jan 15 13:44:47.102677 containerd[1513]: time="2025-01-15T13:44:47.102529620Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:47.104233 containerd[1513]: time="2025-01-15T13:44:47.104169858Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298"
Jan 15 13:44:47.105425 containerd[1513]: time="2025-01-15T13:44:47.105369075Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:47.110532 containerd[1513]: time="2025-01-15T13:44:47.110488393Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:47.112195 containerd[1513]: time="2025-01-15T13:44:47.111976149Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 616.163656ms"
Jan 15 13:44:47.112195 containerd[1513]: time="2025-01-15T13:44:47.112036795Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Jan 15 13:44:47.142363 containerd[1513]: time="2025-01-15T13:44:47.141984120Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Jan 15 13:44:47.711473 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Jan 15 13:44:47.720690 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 15 13:44:47.793347 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1759930576.mount: Deactivated successfully.
Jan 15 13:44:47.988657 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 15 13:44:47.998998 (kubelet)[2097]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jan 15 13:44:48.153085 kubelet[2097]: E0115 13:44:48.152877 2097 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jan 15 13:44:48.155581 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jan 15 13:44:48.155821 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jan 15 13:44:52.591522 containerd[1513]: time="2025-01-15T13:44:52.591149239Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:52.594938 containerd[1513]: time="2025-01-15T13:44:52.594864682Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579"
Jan 15 13:44:52.596607 containerd[1513]: time="2025-01-15T13:44:52.596540320Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:52.600755 containerd[1513]: time="2025-01-15T13:44:52.600690895Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:44:52.603577 containerd[1513]: time="2025-01-15T13:44:52.602509555Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 5.46047283s"
Jan 15 13:44:52.603577 containerd[1513]: time="2025-01-15T13:44:52.602559063Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
Jan 15 13:44:56.608173 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 15 13:44:56.626988 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 15 13:44:56.663944 systemd[1]: Reloading requested from client PID 2209 ('systemctl') (unit session-9.scope)...
Jan 15 13:44:56.664043 systemd[1]: Reloading...
Jan 15 13:44:56.853092 zram_generator::config[2244]: No configuration found.
Jan 15 13:44:57.041614 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 15 13:44:57.151477 systemd[1]: Reloading finished in 486 ms.
Jan 15 13:44:57.247981 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 15 13:44:57.250920 systemd[1]: kubelet.service: Deactivated successfully.
Jan 15 13:44:57.251337 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 15 13:44:57.258029 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 15 13:44:57.405297 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 15 13:44:57.419989 (kubelet)[2317]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 15 13:44:57.521642 kubelet[2317]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 15 13:44:57.521642 kubelet[2317]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 15 13:44:57.521642 kubelet[2317]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 15 13:44:57.524293 kubelet[2317]: I0115 13:44:57.524199 2317 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 15 13:44:57.944500 kubelet[2317]: I0115 13:44:57.944380 2317 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Jan 15 13:44:57.944500 kubelet[2317]: I0115 13:44:57.944425 2317 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 15 13:44:57.944802 kubelet[2317]: I0115 13:44:57.944764 2317 server.go:927] "Client rotation is on, will bootstrap in background"
Jan 15 13:44:57.965475 kubelet[2317]: I0115 13:44:57.964764 2317 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 15 13:44:57.966257 kubelet[2317]: E0115 13:44:57.966160 2317 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.230.66.218:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:57.987326 kubelet[2317]: I0115 13:44:57.987116 2317 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jan 15 13:44:57.990831 kubelet[2317]: I0115 13:44:57.990248 2317 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 15 13:44:57.990831 kubelet[2317]: I0115 13:44:57.990523 2317 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-6yg2e.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Jan 15 13:44:57.991204 kubelet[2317]: I0115 13:44:57.990848 2317 topology_manager.go:138] "Creating topology manager with none policy"
Jan 15 13:44:57.991204 kubelet[2317]: I0115 13:44:57.990867 2317 container_manager_linux.go:301] "Creating device plugin manager"
Jan 15 13:44:57.992113 kubelet[2317]: I0115 13:44:57.992068 2317 state_mem.go:36] "Initialized new in-memory state store"
Jan 15 13:44:57.993253 kubelet[2317]: I0115 13:44:57.993213 2317 kubelet.go:400] "Attempting to sync node with API server"
Jan 15 13:44:57.994355 kubelet[2317]: I0115 13:44:57.993950 2317 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 15 13:44:57.994355 kubelet[2317]: I0115 13:44:57.994022 2317 kubelet.go:312] "Adding apiserver pod source"
Jan 15 13:44:57.994355 kubelet[2317]: I0115 13:44:57.994075 2317 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 15 13:44:57.994355 kubelet[2317]: W0115 13:44:57.994068 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.66.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6yg2e.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:57.994355 kubelet[2317]: E0115 13:44:57.994155 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.230.66.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6yg2e.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:57.997351 kubelet[2317]: W0115 13:44:57.997171 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.66.218:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:57.997351 kubelet[2317]: E0115 13:44:57.997244 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.230.66.218:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:57.998603 kubelet[2317]: I0115 13:44:57.998209 2317 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Jan 15 13:44:58.000460 kubelet[2317]: I0115 13:44:57.999958 2317 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 15 13:44:58.000460 kubelet[2317]: W0115 13:44:58.000170 2317 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jan 15 13:44:58.001508 kubelet[2317]: I0115 13:44:58.001487 2317 server.go:1264] "Started kubelet"
Jan 15 13:44:58.005239 kubelet[2317]: I0115 13:44:58.004495 2317 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 15 13:44:58.007460 kubelet[2317]: I0115 13:44:58.006211 2317 server.go:455] "Adding debug handlers to kubelet server"
Jan 15 13:44:58.007460 kubelet[2317]: I0115 13:44:58.006851 2317 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 15 13:44:58.007635 kubelet[2317]: E0115 13:44:58.007298 2317 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.66.218:6443/api/v1/namespaces/default/events\": dial tcp 10.230.66.218:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-6yg2e.gb1.brightbox.com.181ae1a35973d494 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-6yg2e.gb1.brightbox.com,UID:srv-6yg2e.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-6yg2e.gb1.brightbox.com,},FirstTimestamp:2025-01-15 13:44:58.001429652 +0000 UTC m=+0.575225164,LastTimestamp:2025-01-15 13:44:58.001429652 +0000 UTC m=+0.575225164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-6yg2e.gb1.brightbox.com,}"
Jan 15 13:44:58.007812 kubelet[2317]: I0115 13:44:58.007732 2317 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 15 13:44:58.011815 kubelet[2317]: I0115 13:44:58.009963 2317 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 15 13:44:58.011815 kubelet[2317]: I0115 13:44:58.010239 2317 volume_manager.go:291] "Starting Kubelet Volume Manager"
Jan 15 13:44:58.017405 kubelet[2317]: I0115 13:44:58.017352 2317 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Jan 15 13:44:58.017573 kubelet[2317]: I0115 13:44:58.017552 2317 reconciler.go:26] "Reconciler: start to sync state"
Jan 15 13:44:58.018396 kubelet[2317]: W0115 13:44:58.018338 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.66.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:58.018531 kubelet[2317]: E0115 13:44:58.018410 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.230.66.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:58.020196 kubelet[2317]: E0115 13:44:58.020121 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6yg2e.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.218:6443: connect: connection refused" interval="200ms"
Jan 15 13:44:58.022836 kubelet[2317]: I0115 13:44:58.022810 2317 factory.go:221] Registration of the systemd container factory successfully
Jan 15 13:44:58.024088 kubelet[2317]: E0115 13:44:58.024039 2317 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 15 13:44:58.024382 kubelet[2317]: I0115 13:44:58.024223 2317 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 15 13:44:58.026626 kubelet[2317]: I0115 13:44:58.026591 2317 factory.go:221] Registration of the containerd container factory successfully
Jan 15 13:44:58.054203 kubelet[2317]: I0115 13:44:58.054130 2317 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 15 13:44:58.056240 kubelet[2317]: I0115 13:44:58.055750 2317 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 15 13:44:58.056240 kubelet[2317]: I0115 13:44:58.055820 2317 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 15 13:44:58.056240 kubelet[2317]: I0115 13:44:58.055862 2317 kubelet.go:2337] "Starting kubelet main sync loop"
Jan 15 13:44:58.056240 kubelet[2317]: E0115 13:44:58.055948 2317 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 15 13:44:58.073837 kubelet[2317]: W0115 13:44:58.073726 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.66.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:58.073837 kubelet[2317]: E0115 13:44:58.073799 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.230.66.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:58.076141 kubelet[2317]: I0115 13:44:58.075791 2317 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 15 13:44:58.076141 kubelet[2317]: I0115 13:44:58.075815 2317 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 15 13:44:58.076141 kubelet[2317]: I0115 13:44:58.075850 2317 state_mem.go:36] "Initialized new in-memory state store"
Jan 15 13:44:58.077841 kubelet[2317]: I0115 13:44:58.077819 2317 policy_none.go:49] "None policy: Start"
Jan 15 13:44:58.078753 kubelet[2317]: I0115 13:44:58.078716 2317 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 15 13:44:58.078833 kubelet[2317]: I0115 13:44:58.078762 2317 state_mem.go:35] "Initializing new in-memory state store"
Jan 15 13:44:58.088908 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Jan 15 13:44:58.108393 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Jan 15 13:44:58.114816 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Jan 15 13:44:58.123783 kubelet[2317]: I0115 13:44:58.123736 2317 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 15 13:44:58.124256 kubelet[2317]: I0115 13:44:58.124139 2317 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 15 13:44:58.124589 kubelet[2317]: I0115 13:44:58.124560 2317 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 15 13:44:58.126832 kubelet[2317]: I0115 13:44:58.126124 2317 kubelet_node_status.go:73] "Attempting to register node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.128148 kubelet[2317]: E0115 13:44:58.128103 2317 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.66.218:6443/api/v1/nodes\": dial tcp 10.230.66.218:6443: connect: connection refused" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.128722 kubelet[2317]: E0115 13:44:58.128692 2317 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-6yg2e.gb1.brightbox.com\" not found"
Jan 15 13:44:58.156851 kubelet[2317]: I0115 13:44:58.156721 2317 topology_manager.go:215] "Topology Admit Handler" podUID="4b307409f158a6b009e337a8e6dda4f8" podNamespace="kube-system" podName="kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.160450 kubelet[2317]: I0115 13:44:58.160078 2317 topology_manager.go:215] "Topology Admit Handler" podUID="e41e3550d165c0102abe7a0d076c72f4" podNamespace="kube-system" podName="kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.162747 kubelet[2317]: I0115 13:44:58.162705 2317 topology_manager.go:215] "Topology Admit Handler" podUID="b9cc2cf83c8bf4eb107c225fbbd2a636" podNamespace="kube-system" podName="kube-scheduler-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.172806 systemd[1]: Created slice kubepods-burstable-pod4b307409f158a6b009e337a8e6dda4f8.slice - libcontainer container kubepods-burstable-pod4b307409f158a6b009e337a8e6dda4f8.slice.
Jan 15 13:44:58.190314 systemd[1]: Created slice kubepods-burstable-pode41e3550d165c0102abe7a0d076c72f4.slice - libcontainer container kubepods-burstable-pode41e3550d165c0102abe7a0d076c72f4.slice.
Jan 15 13:44:58.198670 systemd[1]: Created slice kubepods-burstable-podb9cc2cf83c8bf4eb107c225fbbd2a636.slice - libcontainer container kubepods-burstable-podb9cc2cf83c8bf4eb107c225fbbd2a636.slice.
Jan 15 13:44:58.220977 kubelet[2317]: E0115 13:44:58.220875 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6yg2e.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.218:6443: connect: connection refused" interval="400ms"
Jan 15 13:44:58.319859 kubelet[2317]: I0115 13:44:58.319703 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-flexvolume-dir\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.319859 kubelet[2317]: I0115 13:44:58.319793 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-k8s-certs\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.319859 kubelet[2317]: I0115 13:44:58.319841 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b9cc2cf83c8bf4eb107c225fbbd2a636-kubeconfig\") pod \"kube-scheduler-srv-6yg2e.gb1.brightbox.com\" (UID: \"b9cc2cf83c8bf4eb107c225fbbd2a636\") " pod="kube-system/kube-scheduler-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.319859 kubelet[2317]: I0115 13:44:58.319870 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-ca-certs\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.320367 kubelet[2317]: I0115 13:44:58.319901 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-kubeconfig\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.320367 kubelet[2317]: I0115 13:44:58.319935 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.320367 kubelet[2317]: I0115 13:44:58.319969 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b307409f158a6b009e337a8e6dda4f8-ca-certs\") pod \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" (UID: \"4b307409f158a6b009e337a8e6dda4f8\") " pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.320367 kubelet[2317]: I0115 13:44:58.319995 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b307409f158a6b009e337a8e6dda4f8-k8s-certs\") pod \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" (UID: \"4b307409f158a6b009e337a8e6dda4f8\") " pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.320367 kubelet[2317]: I0115 13:44:58.320033 2317 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b307409f158a6b009e337a8e6dda4f8-usr-share-ca-certificates\") pod \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" (UID: \"4b307409f158a6b009e337a8e6dda4f8\") " pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.327270 update_engine[1495]: I20250115 13:44:58.326952 1495 update_attempter.cc:509] Updating boot flags...
Jan 15 13:44:58.332320 kubelet[2317]: I0115 13:44:58.331733 2317 kubelet_node_status.go:73] "Attempting to register node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.332320 kubelet[2317]: E0115 13:44:58.332199 2317 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.66.218:6443/api/v1/nodes\": dial tcp 10.230.66.218:6443: connect: connection refused" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.419572 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (2355)
Jan 15 13:44:58.494508 containerd[1513]: time="2025-01-15T13:44:58.494202319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-6yg2e.gb1.brightbox.com,Uid:4b307409f158a6b009e337a8e6dda4f8,Namespace:kube-system,Attempt:0,}"
Jan 15 13:44:58.502726 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (2356)
Jan 15 13:44:58.502864 containerd[1513]: time="2025-01-15T13:44:58.498380418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-6yg2e.gb1.brightbox.com,Uid:e41e3550d165c0102abe7a0d076c72f4,Namespace:kube-system,Attempt:0,}"
Jan 15 13:44:58.506762 containerd[1513]: time="2025-01-15T13:44:58.506163144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-6yg2e.gb1.brightbox.com,Uid:b9cc2cf83c8bf4eb107c225fbbd2a636,Namespace:kube-system,Attempt:0,}"
Jan 15 13:44:58.621922 kubelet[2317]: E0115 13:44:58.621803 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6yg2e.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.218:6443: connect: connection refused" interval="800ms"
Jan 15 13:44:58.737075 kubelet[2317]: I0115 13:44:58.736988 2317 kubelet_node_status.go:73] "Attempting to register node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.737669 kubelet[2317]: E0115 13:44:58.737589 2317 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.66.218:6443/api/v1/nodes\": dial tcp 10.230.66.218:6443: connect: connection refused" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:58.893350 kubelet[2317]: W0115 13:44:58.893142 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.66.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6yg2e.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:58.893350 kubelet[2317]: E0115 13:44:58.893244 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.230.66.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6yg2e.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.137180 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3902681668.mount: Deactivated successfully.
Jan 15 13:44:59.146691 containerd[1513]: time="2025-01-15T13:44:59.146431806Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 15 13:44:59.148405 containerd[1513]: time="2025-01-15T13:44:59.148319203Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Jan 15 13:44:59.149255 containerd[1513]: time="2025-01-15T13:44:59.149190725Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 15 13:44:59.151606 containerd[1513]: time="2025-01-15T13:44:59.151344077Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 15 13:44:59.152651 containerd[1513]: time="2025-01-15T13:44:59.152518167Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Jan 15 13:44:59.153643 containerd[1513]: time="2025-01-15T13:44:59.153591387Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Jan 15 13:44:59.153812 containerd[1513]: time="2025-01-15T13:44:59.153782374Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 15 13:44:59.158868 containerd[1513]: time="2025-01-15T13:44:59.158810291Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jan 15 13:44:59.162330 containerd[1513]: time="2025-01-15T13:44:59.161250627Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 662.754242ms"
Jan 15 13:44:59.163977 containerd[1513]: time="2025-01-15T13:44:59.163813098Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 657.542312ms"
Jan 15 13:44:59.166386 containerd[1513]: time="2025-01-15T13:44:59.166322760Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 669.907ms"
Jan 15 13:44:59.306065 kubelet[2317]: W0115 13:44:59.305854 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.66.218:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.306065 kubelet[2317]: E0115 13:44:59.306021 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.230.66.218:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.424613 kubelet[2317]: E0115 13:44:59.423331 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.66.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6yg2e.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.66.218:6443: connect: connection refused" interval="1.6s"
Jan 15 13:44:59.427679 kubelet[2317]: W0115 13:44:59.427508 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.66.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.427679 kubelet[2317]: E0115 13:44:59.427606 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.230.66.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.489725 containerd[1513]: time="2025-01-15T13:44:59.489292810Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:44:59.489725 containerd[1513]: time="2025-01-15T13:44:59.489378369Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:44:59.489725 containerd[1513]: time="2025-01-15T13:44:59.489396042Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:44:59.489725 containerd[1513]: time="2025-01-15T13:44:59.489530948Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:44:59.492278 containerd[1513]: time="2025-01-15T13:44:59.491997952Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:44:59.492278 containerd[1513]: time="2025-01-15T13:44:59.492075351Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:44:59.492278 containerd[1513]: time="2025-01-15T13:44:59.492098639Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:44:59.492278 containerd[1513]: time="2025-01-15T13:44:59.492201748Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:44:59.499220 containerd[1513]: time="2025-01-15T13:44:59.498681356Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:44:59.499220 containerd[1513]: time="2025-01-15T13:44:59.498816400Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:44:59.499220 containerd[1513]: time="2025-01-15T13:44:59.498852058Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:44:59.499220 containerd[1513]: time="2025-01-15T13:44:59.499059277Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:44:59.544336 kubelet[2317]: I0115 13:44:59.543521 2317 kubelet_node_status.go:73] "Attempting to register node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:59.544336 kubelet[2317]: E0115 13:44:59.544083 2317 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.66.218:6443/api/v1/nodes\": dial tcp 10.230.66.218:6443: connect: connection refused" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:44:59.554666 kubelet[2317]: W0115 13:44:59.553531 2317 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.66.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.554666 kubelet[2317]: E0115 13:44:59.554596 2317 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.230.66.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:44:59.586244 systemd[1]: Started cri-containerd-54c7ff365d416145879dd09712ee2ef3e9b86ff51cd9b19f9963f63ea3cd0c5a.scope - libcontainer container 54c7ff365d416145879dd09712ee2ef3e9b86ff51cd9b19f9963f63ea3cd0c5a.
Jan 15 13:44:59.590535 systemd[1]: Started cri-containerd-d90f388f48306868adf884461f1ec0bb310be3589876609011f9e26119d90c0c.scope - libcontainer container d90f388f48306868adf884461f1ec0bb310be3589876609011f9e26119d90c0c.
Jan 15 13:44:59.609607 systemd[1]: Started cri-containerd-4853bcb8324e344a6fe99fc8f4464d869f31760566702e954f2b236075bf4e45.scope - libcontainer container 4853bcb8324e344a6fe99fc8f4464d869f31760566702e954f2b236075bf4e45.
Jan 15 13:44:59.742970 containerd[1513]: time="2025-01-15T13:44:59.742798483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-6yg2e.gb1.brightbox.com,Uid:b9cc2cf83c8bf4eb107c225fbbd2a636,Namespace:kube-system,Attempt:0,} returns sandbox id \"54c7ff365d416145879dd09712ee2ef3e9b86ff51cd9b19f9963f63ea3cd0c5a\""
Jan 15 13:44:59.753488 containerd[1513]: time="2025-01-15T13:44:59.753260231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-6yg2e.gb1.brightbox.com,Uid:e41e3550d165c0102abe7a0d076c72f4,Namespace:kube-system,Attempt:0,} returns sandbox id \"d90f388f48306868adf884461f1ec0bb310be3589876609011f9e26119d90c0c\""
Jan 15 13:44:59.756460 containerd[1513]: time="2025-01-15T13:44:59.755693739Z" level=info msg="CreateContainer within sandbox \"54c7ff365d416145879dd09712ee2ef3e9b86ff51cd9b19f9963f63ea3cd0c5a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jan 15 13:44:59.759151 containerd[1513]: time="2025-01-15T13:44:59.759111208Z" level=info msg="CreateContainer within sandbox \"d90f388f48306868adf884461f1ec0bb310be3589876609011f9e26119d90c0c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jan 15 13:44:59.764994 containerd[1513]: time="2025-01-15T13:44:59.764945822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-6yg2e.gb1.brightbox.com,Uid:4b307409f158a6b009e337a8e6dda4f8,Namespace:kube-system,Attempt:0,} returns sandbox id \"4853bcb8324e344a6fe99fc8f4464d869f31760566702e954f2b236075bf4e45\""
Jan 15 13:44:59.772093 containerd[1513]: time="2025-01-15T13:44:59.772056731Z" level=info msg="CreateContainer within sandbox \"4853bcb8324e344a6fe99fc8f4464d869f31760566702e954f2b236075bf4e45\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jan 15 13:44:59.795419 containerd[1513]: time="2025-01-15T13:44:59.795352314Z" level=info msg="CreateContainer within sandbox \"54c7ff365d416145879dd09712ee2ef3e9b86ff51cd9b19f9963f63ea3cd0c5a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"cebc95debceedbd4d4008627fb8fd932f743e1c5c24f2e09cc4c9f60371f3a47\""
Jan 15 13:44:59.797033 containerd[1513]: time="2025-01-15T13:44:59.796996758Z" level=info msg="CreateContainer within sandbox \"d90f388f48306868adf884461f1ec0bb310be3589876609011f9e26119d90c0c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7b7b3fe3881be3642e35686792a7c068c93f4a5e43db304b0977f48390ca9d90\""
Jan 15 13:44:59.798140 containerd[1513]: time="2025-01-15T13:44:59.798105694Z" level=info msg="StartContainer for \"7b7b3fe3881be3642e35686792a7c068c93f4a5e43db304b0977f48390ca9d90\""
Jan 15 13:44:59.798473 containerd[1513]: time="2025-01-15T13:44:59.798324771Z" level=info msg="StartContainer for \"cebc95debceedbd4d4008627fb8fd932f743e1c5c24f2e09cc4c9f60371f3a47\""
Jan 15 13:44:59.800876 containerd[1513]: time="2025-01-15T13:44:59.800839876Z" level=info msg="CreateContainer within sandbox \"4853bcb8324e344a6fe99fc8f4464d869f31760566702e954f2b236075bf4e45\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"c43194dc1691526341ea6772d6fe549917f45ea75b3d1d9a3836e5ae71f96279\""
Jan 15 13:44:59.802463 containerd[1513]: time="2025-01-15T13:44:59.801406130Z" level=info msg="StartContainer for \"c43194dc1691526341ea6772d6fe549917f45ea75b3d1d9a3836e5ae71f96279\""
Jan 15 13:44:59.852720 systemd[1]: Started cri-containerd-7b7b3fe3881be3642e35686792a7c068c93f4a5e43db304b0977f48390ca9d90.scope - libcontainer container 7b7b3fe3881be3642e35686792a7c068c93f4a5e43db304b0977f48390ca9d90.
Jan 15 13:44:59.867107 systemd[1]: Started cri-containerd-c43194dc1691526341ea6772d6fe549917f45ea75b3d1d9a3836e5ae71f96279.scope - libcontainer container c43194dc1691526341ea6772d6fe549917f45ea75b3d1d9a3836e5ae71f96279.
Jan 15 13:44:59.877707 systemd[1]: Started cri-containerd-cebc95debceedbd4d4008627fb8fd932f743e1c5c24f2e09cc4c9f60371f3a47.scope - libcontainer container cebc95debceedbd4d4008627fb8fd932f743e1c5c24f2e09cc4c9f60371f3a47.
Jan 15 13:44:59.993876 containerd[1513]: time="2025-01-15T13:44:59.993300316Z" level=info msg="StartContainer for \"7b7b3fe3881be3642e35686792a7c068c93f4a5e43db304b0977f48390ca9d90\" returns successfully"
Jan 15 13:45:00.007757 containerd[1513]: time="2025-01-15T13:45:00.007682026Z" level=info msg="StartContainer for \"c43194dc1691526341ea6772d6fe549917f45ea75b3d1d9a3836e5ae71f96279\" returns successfully"
Jan 15 13:45:00.023005 containerd[1513]: time="2025-01-15T13:45:00.022925141Z" level=info msg="StartContainer for \"cebc95debceedbd4d4008627fb8fd932f743e1c5c24f2e09cc4c9f60371f3a47\" returns successfully"
Jan 15 13:45:00.133467 kubelet[2317]: E0115 13:45:00.131748 2317 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.230.66.218:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.230.66.218:6443: connect: connection refused
Jan 15 13:45:01.152830 kubelet[2317]: I0115 13:45:01.152715 2317 kubelet_node_status.go:73] "Attempting to register node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:02.785137 kubelet[2317]: E0115 13:45:02.785064 2317 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-6yg2e.gb1.brightbox.com\" not found" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:02.902761 kubelet[2317]: I0115 13:45:02.902330 2317 kubelet_node_status.go:76] "Successfully registered node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:02.906503 kubelet[2317]: E0115 13:45:02.906349 2317 kubelet_node_status.go:544] "Error updating node status, will retry" err="error getting node \"srv-6yg2e.gb1.brightbox.com\": nodes \"srv-6yg2e.gb1.brightbox.com\" not found"
Jan 15 13:45:02.997970 kubelet[2317]: I0115 13:45:02.997837 2317 apiserver.go:52] "Watching apiserver"
Jan 15 13:45:03.018175 kubelet[2317]: I0115 13:45:03.018086 2317 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Jan 15 13:45:04.953917 systemd[1]: Reloading requested from client PID 2608 ('systemctl') (unit session-9.scope)...
Jan 15 13:45:04.953965 systemd[1]: Reloading...
Jan 15 13:45:05.010719 kubelet[2317]: W0115 13:45:05.010651 2317 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 15 13:45:05.087510 zram_generator::config[2643]: No configuration found.
Jan 15 13:45:05.276588 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 15 13:45:05.409677 systemd[1]: Reloading finished in 454 ms.
Jan 15 13:45:05.482478 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 15 13:45:05.494392 systemd[1]: kubelet.service: Deactivated successfully.
Jan 15 13:45:05.495102 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 15 13:45:05.495208 systemd[1]: kubelet.service: Consumed 1.109s CPU time, 110.7M memory peak, 0B memory swap peak.
Jan 15 13:45:05.501900 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jan 15 13:45:05.702250 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jan 15 13:45:05.714965 (kubelet)[2711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jan 15 13:45:05.824492 kubelet[2711]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 15 13:45:05.824492 kubelet[2711]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 15 13:45:05.824492 kubelet[2711]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 15 13:45:05.824492 kubelet[2711]: I0115 13:45:05.824246 2711 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 15 13:45:05.833224 kubelet[2711]: I0115 13:45:05.833162 2711 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Jan 15 13:45:05.833224 kubelet[2711]: I0115 13:45:05.833194 2711 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 15 13:45:05.833541 kubelet[2711]: I0115 13:45:05.833456 2711 server.go:927] "Client rotation is on, will bootstrap in background"
Jan 15 13:45:05.835292 kubelet[2711]: I0115 13:45:05.835261 2711 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 15 13:45:05.837284 kubelet[2711]: I0115 13:45:05.836809 2711 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jan 15 13:45:05.850298 kubelet[2711]: I0115 13:45:05.850028 2711 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jan 15 13:45:05.850595 kubelet[2711]: I0115 13:45:05.850405 2711 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 15 13:45:05.850905 kubelet[2711]: I0115 13:45:05.850494 2711 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-6yg2e.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Jan 15 13:45:05.851059 kubelet[2711]: I0115 13:45:05.850929 2711 topology_manager.go:138] "Creating topology manager with none policy"
Jan 15 13:45:05.851059 kubelet[2711]: I0115 13:45:05.850947 2711 container_manager_linux.go:301] "Creating device plugin manager"
Jan 15 13:45:05.851059 kubelet[2711]: I0115 13:45:05.851000 2711 state_mem.go:36] "Initialized new in-memory state store"
Jan 15 13:45:05.851266 kubelet[2711]: I0115 13:45:05.851190 2711 kubelet.go:400] "Attempting to sync node with API server"
Jan 15 13:45:05.851266 kubelet[2711]: I0115 13:45:05.851218 2711 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 15 13:45:05.851266 kubelet[2711]: I0115 13:45:05.851250 2711 kubelet.go:312] "Adding apiserver pod source"
Jan 15 13:45:05.852243 kubelet[2711]: I0115 13:45:05.851274 2711 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 15 13:45:05.853240 kubelet[2711]: I0115 13:45:05.853202 2711 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Jan 15 13:45:05.855531 kubelet[2711]: I0115 13:45:05.855506 2711 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 15 13:45:05.856154 kubelet[2711]: I0115 13:45:05.856129 2711 server.go:1264] "Started kubelet"
Jan 15 13:45:05.859776 kubelet[2711]: I0115 13:45:05.859748 2711 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 15 13:45:05.873841 kubelet[2711]: I0115 13:45:05.873062 2711 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 15 13:45:05.875661 kubelet[2711]: I0115 13:45:05.874639 2711 server.go:455] "Adding debug handlers to kubelet server"
Jan 15 13:45:05.877299 kubelet[2711]: I0115 13:45:05.875959 2711 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 15 13:45:05.881452 kubelet[2711]: I0115 13:45:05.877799 2711 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 15 13:45:05.881452 kubelet[2711]: I0115 13:45:05.877871 2711 volume_manager.go:291] "Starting Kubelet Volume Manager"
Jan 15 13:45:05.881452 kubelet[2711]: I0115 13:45:05.880300 2711 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Jan 15 13:45:05.881452 kubelet[2711]: I0115 13:45:05.880519 2711 reconciler.go:26] "Reconciler: start to sync state"
Jan 15 13:45:05.885463 kubelet[2711]: I0115 13:45:05.883715 2711 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 15 13:45:05.888374 kubelet[2711]: I0115 13:45:05.887481 2711 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 15 13:45:05.888374 kubelet[2711]: I0115 13:45:05.887541 2711 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 15 13:45:05.888374 kubelet[2711]: I0115 13:45:05.887568 2711 kubelet.go:2337] "Starting kubelet main sync loop"
Jan 15 13:45:05.888374 kubelet[2711]: E0115 13:45:05.887623 2711 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 15 13:45:05.901913 kubelet[2711]: I0115 13:45:05.901872 2711 factory.go:221] Registration of the containerd container factory successfully
Jan 15 13:45:05.902251 kubelet[2711]: I0115 13:45:05.902232 2711 factory.go:221] Registration of the systemd container factory successfully
Jan 15 13:45:05.902478 kubelet[2711]: I0115 13:45:05.902431 2711 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jan 15 13:45:05.918645 kubelet[2711]: E0115 13:45:05.918601 2711 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jan 15 13:45:05.988976 kubelet[2711]: E0115 13:45:05.988919 2711 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 15 13:45:05.991525 kubelet[2711]: I0115 13:45:05.991083 2711 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jan 15 13:45:05.991525 kubelet[2711]: I0115 13:45:05.991108 2711 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jan 15 13:45:05.991525 kubelet[2711]: I0115 13:45:05.991162 2711 state_mem.go:36] "Initialized new in-memory state store"
Jan 15 13:45:05.991696 kubelet[2711]: I0115 13:45:05.991647 2711 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jan 15 13:45:05.991696 kubelet[2711]: I0115 13:45:05.991670 2711 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jan 15 13:45:05.991845 kubelet[2711]: I0115 13:45:05.991731 2711 policy_none.go:49] "None policy: Start"
Jan 15 13:45:05.992133 kubelet[2711]: I0115 13:45:05.992108 2711 kubelet_node_status.go:73] "Attempting to register node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:05.993237 kubelet[2711]: I0115 13:45:05.993204 2711 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 15 13:45:05.993368 kubelet[2711]: I0115 13:45:05.993348 2711 state_mem.go:35] "Initializing new in-memory state store"
Jan 15 13:45:05.993740 kubelet[2711]: I0115 13:45:05.993686 2711 state_mem.go:75] "Updated machine memory state"
Jan 15 13:45:06.007088 kubelet[2711]: I0115 13:45:06.004970 2711 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 15 13:45:06.007088 kubelet[2711]: I0115 13:45:06.005282 2711 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 15 13:45:06.007088 kubelet[2711]: I0115 13:45:06.005457 2711 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 15 13:45:06.010600 kubelet[2711]: I0115 13:45:06.010572 2711 kubelet_node_status.go:112] "Node was previously registered" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.012559 kubelet[2711]: I0115 13:45:06.012024 2711 kubelet_node_status.go:76] "Successfully registered node" node="srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.189201 kubelet[2711]: I0115 13:45:06.189113 2711 topology_manager.go:215] "Topology Admit Handler" podUID="4b307409f158a6b009e337a8e6dda4f8" podNamespace="kube-system" podName="kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.189659 kubelet[2711]: I0115 13:45:06.189628 2711 topology_manager.go:215] "Topology Admit Handler" podUID="e41e3550d165c0102abe7a0d076c72f4" podNamespace="kube-system" podName="kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.189895 kubelet[2711]: I0115 13:45:06.189868 2711 topology_manager.go:215] "Topology Admit Handler" podUID="b9cc2cf83c8bf4eb107c225fbbd2a636" podNamespace="kube-system" podName="kube-scheduler-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.200962 kubelet[2711]: W0115 13:45:06.200416 2711 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 15 13:45:06.200962 kubelet[2711]: W0115 13:45:06.200972 2711 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 15 13:45:06.202714 kubelet[2711]: W0115 13:45:06.202475 2711 warnings.go:70]
Jan 15 13:45:06.202714 kubelet[2711]: W0115 13:45:06.202475 2711 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Jan 15 13:45:06.202714 kubelet[2711]: E0115 13:45:06.202552 2711 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.284529 kubelet[2711]: I0115 13:45:06.284335 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b307409f158a6b009e337a8e6dda4f8-k8s-certs\") pod \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" (UID: \"4b307409f158a6b009e337a8e6dda4f8\") " pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.284529 kubelet[2711]: I0115 13:45:06.284401 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b307409f158a6b009e337a8e6dda4f8-usr-share-ca-certificates\") pod \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" (UID: \"4b307409f158a6b009e337a8e6dda4f8\") " pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.284529 kubelet[2711]: I0115 13:45:06.284458 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-flexvolume-dir\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.284529 kubelet[2711]: I0115 13:45:06.284490 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-k8s-certs\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.285797 kubelet[2711]: I0115 13:45:06.284523 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.285797 kubelet[2711]: I0115 13:45:06.284577 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b307409f158a6b009e337a8e6dda4f8-ca-certs\") pod \"kube-apiserver-srv-6yg2e.gb1.brightbox.com\" (UID: \"4b307409f158a6b009e337a8e6dda4f8\") " pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.285797 kubelet[2711]: I0115 13:45:06.284607 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-ca-certs\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.285797 kubelet[2711]: I0115 13:45:06.284633 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e41e3550d165c0102abe7a0d076c72f4-kubeconfig\") pod \"kube-controller-manager-srv-6yg2e.gb1.brightbox.com\" (UID: \"e41e3550d165c0102abe7a0d076c72f4\") " pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.285797 kubelet[2711]: I0115 13:45:06.284659 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b9cc2cf83c8bf4eb107c225fbbd2a636-kubeconfig\") pod \"kube-scheduler-srv-6yg2e.gb1.brightbox.com\" (UID: \"b9cc2cf83c8bf4eb107c225fbbd2a636\") " pod="kube-system/kube-scheduler-srv-6yg2e.gb1.brightbox.com"
Jan 15 13:45:06.852278 kubelet[2711]: I0115 13:45:06.852214 2711 apiserver.go:52] "Watching apiserver"
Jan 15 13:45:06.881518 kubelet[2711]: I0115 13:45:06.881322 2711 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Jan 15 13:45:07.018036 kubelet[2711]: I0115 13:45:07.017947 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-6yg2e.gb1.brightbox.com" podStartSLOduration=1.017906559 podStartE2EDuration="1.017906559s" podCreationTimestamp="2025-01-15 13:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:45:07.002100067 +0000 UTC m=+1.254778157" watchObservedRunningTime="2025-01-15 13:45:07.017906559 +0000 UTC m=+1.270584645"
Jan 15 13:45:07.037981 kubelet[2711]: I0115 13:45:07.037906 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-6yg2e.gb1.brightbox.com" podStartSLOduration=1.037885561 podStartE2EDuration="1.037885561s" podCreationTimestamp="2025-01-15 13:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:45:07.018431768 +0000 UTC m=+1.271109859" watchObservedRunningTime="2025-01-15 13:45:07.037885561 +0000 UTC m=+1.290563646"
Jan 15 13:45:07.050728 kubelet[2711]: I0115 13:45:07.050409 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-6yg2e.gb1.brightbox.com" podStartSLOduration=2.050383504 podStartE2EDuration="2.050383504s" podCreationTimestamp="2025-01-15 13:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:45:07.038941622 +0000 UTC m=+1.291619713" watchObservedRunningTime="2025-01-15 13:45:07.050383504 +0000 UTC m=+1.303061590"
Jan 15 13:45:12.037559 sudo[1756]: pam_unix(sudo:session): session closed for user root
Jan 15 13:45:12.185106 sshd[1753]: pam_unix(sshd:session): session closed for user core
Jan 15 13:45:12.190696 systemd[1]: sshd@6-10.230.66.218:22-147.75.109.163:41672.service: Deactivated successfully.
Jan 15 13:45:12.194019 systemd[1]: session-9.scope: Deactivated successfully.
Jan 15 13:45:12.194381 systemd[1]: session-9.scope: Consumed 6.728s CPU time, 187.3M memory peak, 0B memory swap peak.
Jan 15 13:45:12.197226 systemd-logind[1493]: Session 9 logged out. Waiting for processes to exit.
Jan 15 13:45:12.198948 systemd-logind[1493]: Removed session 9.
Jan 15 13:45:18.331719 kubelet[2711]: I0115 13:45:18.331634 2711 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jan 15 13:45:18.333233 containerd[1513]: time="2025-01-15T13:45:18.332730959Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jan 15 13:45:18.334957 kubelet[2711]: I0115 13:45:18.333010 2711 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jan 15 13:45:19.227248 kubelet[2711]: I0115 13:45:19.225388 2711 topology_manager.go:215] "Topology Admit Handler" podUID="418c153a-ba83-4cc0-bae0-b993b0c9fc2b" podNamespace="kube-system" podName="kube-proxy-h2vmr"
Jan 15 13:45:19.243803 systemd[1]: Created slice kubepods-besteffort-pod418c153a_ba83_4cc0_bae0_b993b0c9fc2b.slice - libcontainer container kubepods-besteffort-pod418c153a_ba83_4cc0_bae0_b993b0c9fc2b.slice.
Jan 15 13:45:19.368995 kubelet[2711]: I0115 13:45:19.368923 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/418c153a-ba83-4cc0-bae0-b993b0c9fc2b-kube-proxy\") pod \"kube-proxy-h2vmr\" (UID: \"418c153a-ba83-4cc0-bae0-b993b0c9fc2b\") " pod="kube-system/kube-proxy-h2vmr"
Jan 15 13:45:19.369770 kubelet[2711]: I0115 13:45:19.369616 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/418c153a-ba83-4cc0-bae0-b993b0c9fc2b-xtables-lock\") pod \"kube-proxy-h2vmr\" (UID: \"418c153a-ba83-4cc0-bae0-b993b0c9fc2b\") " pod="kube-system/kube-proxy-h2vmr"
Jan 15 13:45:19.369770 kubelet[2711]: I0115 13:45:19.369661 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/418c153a-ba83-4cc0-bae0-b993b0c9fc2b-lib-modules\") pod \"kube-proxy-h2vmr\" (UID: \"418c153a-ba83-4cc0-bae0-b993b0c9fc2b\") " pod="kube-system/kube-proxy-h2vmr"
Jan 15 13:45:19.369770 kubelet[2711]: I0115 13:45:19.369703 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-958pr\" (UniqueName: \"kubernetes.io/projected/418c153a-ba83-4cc0-bae0-b993b0c9fc2b-kube-api-access-958pr\") pod \"kube-proxy-h2vmr\" (UID: \"418c153a-ba83-4cc0-bae0-b993b0c9fc2b\") " pod="kube-system/kube-proxy-h2vmr"
Jan 15 13:45:19.455980 kubelet[2711]: I0115 13:45:19.455903 2711 topology_manager.go:215] "Topology Admit Handler" podUID="d62aa4cf-3266-4374-bc54-6fbb7afce069" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-gbb5l"
Jan 15 13:45:19.471470 kubelet[2711]: I0115 13:45:19.469930 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxz79\" (UniqueName: \"kubernetes.io/projected/d62aa4cf-3266-4374-bc54-6fbb7afce069-kube-api-access-mxz79\") pod \"tigera-operator-7bc55997bb-gbb5l\" (UID: \"d62aa4cf-3266-4374-bc54-6fbb7afce069\") " pod="tigera-operator/tigera-operator-7bc55997bb-gbb5l"
Jan 15 13:45:19.471470 kubelet[2711]: I0115 13:45:19.470031 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d62aa4cf-3266-4374-bc54-6fbb7afce069-var-lib-calico\") pod \"tigera-operator-7bc55997bb-gbb5l\" (UID: \"d62aa4cf-3266-4374-bc54-6fbb7afce069\") " pod="tigera-operator/tigera-operator-7bc55997bb-gbb5l"
Jan 15 13:45:19.479584 systemd[1]: Created slice kubepods-besteffort-podd62aa4cf_3266_4374_bc54_6fbb7afce069.slice - libcontainer container kubepods-besteffort-podd62aa4cf_3266_4374_bc54_6fbb7afce069.slice.
Jan 15 13:45:19.553509 containerd[1513]: time="2025-01-15T13:45:19.553422371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h2vmr,Uid:418c153a-ba83-4cc0-bae0-b993b0c9fc2b,Namespace:kube-system,Attempt:0,}"
Jan 15 13:45:19.601498 containerd[1513]: time="2025-01-15T13:45:19.601230596Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:45:19.602793 containerd[1513]: time="2025-01-15T13:45:19.601411657Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:45:19.602793 containerd[1513]: time="2025-01-15T13:45:19.601458547Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:45:19.602793 containerd[1513]: time="2025-01-15T13:45:19.601599341Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:45:19.641806 systemd[1]: Started cri-containerd-0ab071f8fb5c3258f920a27ffc8da7321ce22be0230ccf4bff2c8b5709cf0d05.scope - libcontainer container 0ab071f8fb5c3258f920a27ffc8da7321ce22be0230ccf4bff2c8b5709cf0d05.
Jan 15 13:45:19.681059 containerd[1513]: time="2025-01-15T13:45:19.680918829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h2vmr,Uid:418c153a-ba83-4cc0-bae0-b993b0c9fc2b,Namespace:kube-system,Attempt:0,} returns sandbox id \"0ab071f8fb5c3258f920a27ffc8da7321ce22be0230ccf4bff2c8b5709cf0d05\""
Jan 15 13:45:19.692595 containerd[1513]: time="2025-01-15T13:45:19.692377236Z" level=info msg="CreateContainer within sandbox \"0ab071f8fb5c3258f920a27ffc8da7321ce22be0230ccf4bff2c8b5709cf0d05\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jan 15 13:45:19.715774 containerd[1513]: time="2025-01-15T13:45:19.715546141Z" level=info msg="CreateContainer within sandbox \"0ab071f8fb5c3258f920a27ffc8da7321ce22be0230ccf4bff2c8b5709cf0d05\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"572d38627687d46e0767a62b504c11b6abba6b62e51e8db7969c8c480a21d2cd\""
Jan 15 13:45:19.716709 containerd[1513]: time="2025-01-15T13:45:19.716647642Z" level=info msg="StartContainer for \"572d38627687d46e0767a62b504c11b6abba6b62e51e8db7969c8c480a21d2cd\""
Jan 15 13:45:19.760787 systemd[1]: Started cri-containerd-572d38627687d46e0767a62b504c11b6abba6b62e51e8db7969c8c480a21d2cd.scope - libcontainer container 572d38627687d46e0767a62b504c11b6abba6b62e51e8db7969c8c480a21d2cd.
Jan 15 13:45:19.787349 containerd[1513]: time="2025-01-15T13:45:19.787161176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-gbb5l,Uid:d62aa4cf-3266-4374-bc54-6fbb7afce069,Namespace:tigera-operator,Attempt:0,}"
Jan 15 13:45:19.826290 containerd[1513]: time="2025-01-15T13:45:19.826233390Z" level=info msg="StartContainer for \"572d38627687d46e0767a62b504c11b6abba6b62e51e8db7969c8c480a21d2cd\" returns successfully"
Jan 15 13:45:19.836312 containerd[1513]: time="2025-01-15T13:45:19.836106620Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:45:19.836312 containerd[1513]: time="2025-01-15T13:45:19.836197670Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:45:19.836312 containerd[1513]: time="2025-01-15T13:45:19.836214583Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:45:19.837484 containerd[1513]: time="2025-01-15T13:45:19.836944384Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:45:19.864229 systemd[1]: Started cri-containerd-90f1b5e01a1c64447158aecb182f2b5896b51314b0ebae0d5583f93c6fb171dc.scope - libcontainer container 90f1b5e01a1c64447158aecb182f2b5896b51314b0ebae0d5583f93c6fb171dc.
Jan 15 13:45:19.935573 containerd[1513]: time="2025-01-15T13:45:19.935514775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-gbb5l,Uid:d62aa4cf-3266-4374-bc54-6fbb7afce069,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"90f1b5e01a1c64447158aecb182f2b5896b51314b0ebae0d5583f93c6fb171dc\""
Jan 15 13:45:19.948946 containerd[1513]: time="2025-01-15T13:45:19.948285423Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Jan 15 13:45:20.058741 kubelet[2711]: I0115 13:45:20.057745 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-h2vmr" podStartSLOduration=1.057715712 podStartE2EDuration="1.057715712s" podCreationTimestamp="2025-01-15 13:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:45:20.056518397 +0000 UTC m=+14.309196487" watchObservedRunningTime="2025-01-15 13:45:20.057715712 +0000 UTC m=+14.310393798"
Jan 15 13:45:24.350877 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1012192766.mount: Deactivated successfully.
Jan 15 13:45:25.182263 containerd[1513]: time="2025-01-15T13:45:25.182197251Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:45:25.184164 containerd[1513]: time="2025-01-15T13:45:25.184072715Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764297"
Jan 15 13:45:25.186842 containerd[1513]: time="2025-01-15T13:45:25.186796966Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:45:25.190071 containerd[1513]: time="2025-01-15T13:45:25.190011788Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 15 13:45:25.191765 containerd[1513]: time="2025-01-15T13:45:25.191716555Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 5.243374464s"
Jan 15 13:45:25.191842 containerd[1513]: time="2025-01-15T13:45:25.191770480Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\""
Jan 15 13:45:25.202226 containerd[1513]: time="2025-01-15T13:45:25.201999777Z" level=info msg="CreateContainer within sandbox \"90f1b5e01a1c64447158aecb182f2b5896b51314b0ebae0d5583f93c6fb171dc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jan 15 13:45:25.226078 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount776863353.mount: Deactivated successfully.
Jan 15 13:45:25.231180 containerd[1513]: time="2025-01-15T13:45:25.231129459Z" level=info msg="CreateContainer within sandbox \"90f1b5e01a1c64447158aecb182f2b5896b51314b0ebae0d5583f93c6fb171dc\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a4627944172efafbfdd8917ce33c284b41b5c29ca386b7efd42d8631921f0467\""
Jan 15 13:45:25.234300 containerd[1513]: time="2025-01-15T13:45:25.232519847Z" level=info msg="StartContainer for \"a4627944172efafbfdd8917ce33c284b41b5c29ca386b7efd42d8631921f0467\""
Jan 15 13:45:25.280673 systemd[1]: Started cri-containerd-a4627944172efafbfdd8917ce33c284b41b5c29ca386b7efd42d8631921f0467.scope - libcontainer container a4627944172efafbfdd8917ce33c284b41b5c29ca386b7efd42d8631921f0467.
Jan 15 13:45:25.321748 containerd[1513]: time="2025-01-15T13:45:25.321693881Z" level=info msg="StartContainer for \"a4627944172efafbfdd8917ce33c284b41b5c29ca386b7efd42d8631921f0467\" returns successfully"
Jan 15 13:45:28.531044 kubelet[2711]: I0115 13:45:28.530872 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-gbb5l" podStartSLOduration=4.277764354 podStartE2EDuration="9.529578626s" podCreationTimestamp="2025-01-15 13:45:19 +0000 UTC" firstStartedPulling="2025-01-15 13:45:19.94249583 +0000 UTC m=+14.195173914" lastFinishedPulling="2025-01-15 13:45:25.194310103 +0000 UTC m=+19.446988186" observedRunningTime="2025-01-15 13:45:26.026866894 +0000 UTC m=+20.279544991" watchObservedRunningTime="2025-01-15 13:45:28.529578626 +0000 UTC m=+22.782256713"
Jan 15 13:45:28.540303 kubelet[2711]: I0115 13:45:28.540244 2711 topology_manager.go:215] "Topology Admit Handler" podUID="4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" podNamespace="calico-system" podName="calico-typha-6d8cf878c9-59dfn"
Jan 15 13:45:28.581540 systemd[1]: Created slice kubepods-besteffort-pod4c7ae6ba_25db_42e4_a2dd_054903d9d6d2.slice - libcontainer container kubepods-besteffort-pod4c7ae6ba_25db_42e4_a2dd_054903d9d6d2.slice.
Jan 15 13:45:28.679672 kubelet[2711]: I0115 13:45:28.679562 2711 topology_manager.go:215] "Topology Admit Handler" podUID="c87ee128-b80e-461f-b0cc-3aafd8d5be53" podNamespace="calico-system" podName="calico-node-5l5br"
Jan 15 13:45:28.694587 systemd[1]: Created slice kubepods-besteffort-podc87ee128_b80e_461f_b0cc_3aafd8d5be53.slice - libcontainer container kubepods-besteffort-podc87ee128_b80e_461f_b0cc_3aafd8d5be53.slice.
Jan 15 13:45:28.731854 kubelet[2711]: I0115 13:45:28.731800 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dftq\" (UniqueName: \"kubernetes.io/projected/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-kube-api-access-6dftq\") pod \"calico-typha-6d8cf878c9-59dfn\" (UID: \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\") " pod="calico-system/calico-typha-6d8cf878c9-59dfn"
Jan 15 13:45:28.732020 kubelet[2711]: I0115 13:45:28.731863 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-tigera-ca-bundle\") pod \"calico-typha-6d8cf878c9-59dfn\" (UID: \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\") " pod="calico-system/calico-typha-6d8cf878c9-59dfn"
Jan 15 13:45:28.732020 kubelet[2711]: I0115 13:45:28.731898 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-typha-certs\") pod \"calico-typha-6d8cf878c9-59dfn\" (UID: \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\") " pod="calico-system/calico-typha-6d8cf878c9-59dfn"
Jan 15 13:45:28.834478 kubelet[2711]: I0115 13:45:28.833056 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-net-dir\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834478 kubelet[2711]: I0115 13:45:28.833176 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-lib-calico\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834478 kubelet[2711]: I0115 13:45:28.833212 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-bin-dir\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834478 kubelet[2711]: I0115 13:45:28.833239 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-log-dir\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834478 kubelet[2711]: I0115 13:45:28.833271 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5zz\" (UniqueName: \"kubernetes.io/projected/c87ee128-b80e-461f-b0cc-3aafd8d5be53-kube-api-access-tg5zz\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834827 kubelet[2711]: I0115 13:45:28.833301 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-lib-modules\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834827 kubelet[2711]: I0115 13:45:28.833345 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c87ee128-b80e-461f-b0cc-3aafd8d5be53-tigera-ca-bundle\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834827 kubelet[2711]: I0115 13:45:28.833398 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-flexvol-driver-host\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834827 kubelet[2711]: I0115 13:45:28.833488 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-xtables-lock\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.834827 kubelet[2711]: I0115 13:45:28.833549 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-policysync\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.835047 kubelet[2711]: I0115 13:45:28.833577 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c87ee128-b80e-461f-b0cc-3aafd8d5be53-node-certs\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.835047 kubelet[2711]: I0115 13:45:28.833628 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-run-calico\") pod \"calico-node-5l5br\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " pod="calico-system/calico-node-5l5br"
Jan 15 13:45:28.867610 kubelet[2711]: I0115 13:45:28.866076 2711 topology_manager.go:215] "Topology Admit Handler" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" podNamespace="calico-system" podName="csi-node-driver-zz4wr"
Jan 15 13:45:28.867610 kubelet[2711]: E0115 13:45:28.866528 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba"
Jan 15 13:45:28.890377 containerd[1513]: time="2025-01-15T13:45:28.889818746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6d8cf878c9-59dfn,Uid:4c7ae6ba-25db-42e4-a2dd-054903d9d6d2,Namespace:calico-system,Attempt:0,}"
Jan 15 13:45:28.955108 kubelet[2711]: E0115 13:45:28.954558 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:28.955108 kubelet[2711]: W0115 13:45:28.954608 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:28.955108 kubelet[2711]: E0115 13:45:28.954711 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:28.968372 containerd[1513]: time="2025-01-15T13:45:28.965947992Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:45:28.968372 containerd[1513]: time="2025-01-15T13:45:28.967745499Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:45:28.968372 containerd[1513]: time="2025-01-15T13:45:28.967818533Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:45:28.968372 containerd[1513]: time="2025-01-15T13:45:28.968224114Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 15 13:45:28.990194 kubelet[2711]: E0115 13:45:28.987395 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:28.990194 kubelet[2711]: W0115 13:45:28.987632 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:28.990194 kubelet[2711]: E0115 13:45:28.987676 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.005270 containerd[1513]: time="2025-01-15T13:45:29.005207240Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5l5br,Uid:c87ee128-b80e-461f-b0cc-3aafd8d5be53,Namespace:calico-system,Attempt:0,}"
Jan 15 13:45:29.017663 systemd[1]: Started cri-containerd-086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99.scope - libcontainer container 086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99.
Jan 15 13:45:29.039144 kubelet[2711]: E0115 13:45:29.037694 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.039144 kubelet[2711]: W0115 13:45:29.037750 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.039144 kubelet[2711]: E0115 13:45:29.037785 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.039144 kubelet[2711]: I0115 13:45:29.037863 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zhzc\" (UniqueName: \"kubernetes.io/projected/9dd88286-18b7-4e6e-a5d3-8c847dab96ba-kube-api-access-6zhzc\") pod \"csi-node-driver-zz4wr\" (UID: \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\") " pod="calico-system/csi-node-driver-zz4wr"
Jan 15 13:45:29.039144 kubelet[2711]: E0115 13:45:29.038343 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.039144 kubelet[2711]: W0115 13:45:29.038361 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.039144 kubelet[2711]: E0115 13:45:29.038390 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.040004 kubelet[2711]: E0115 13:45:29.039231 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.040004 kubelet[2711]: W0115 13:45:29.039248 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.040004 kubelet[2711]: E0115 13:45:29.039314 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.040252 kubelet[2711]: E0115 13:45:29.040125 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.040252 kubelet[2711]: W0115 13:45:29.040145 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.040252 kubelet[2711]: E0115 13:45:29.040176 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.040252 kubelet[2711]: I0115 13:45:29.040209 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9dd88286-18b7-4e6e-a5d3-8c847dab96ba-kubelet-dir\") pod \"csi-node-driver-zz4wr\" (UID: \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\") " pod="calico-system/csi-node-driver-zz4wr"
Jan 15 13:45:29.041090 kubelet[2711]: E0115 13:45:29.041052 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.041090 kubelet[2711]: W0115 13:45:29.041079 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.041201 kubelet[2711]: E0115 13:45:29.041117 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.041201 kubelet[2711]: I0115 13:45:29.041147 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/9dd88286-18b7-4e6e-a5d3-8c847dab96ba-varrun\") pod \"csi-node-driver-zz4wr\" (UID: \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\") " pod="calico-system/csi-node-driver-zz4wr"
Jan 15 13:45:29.042007 kubelet[2711]: E0115 13:45:29.041927 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.042007 kubelet[2711]: W0115 13:45:29.041950 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.042007 kubelet[2711]: E0115 13:45:29.041974 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.042823 kubelet[2711]: E0115 13:45:29.042790 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.042823 kubelet[2711]: W0115 13:45:29.042814 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.044051 kubelet[2711]: E0115 13:45:29.042849 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.044051 kubelet[2711]: E0115 13:45:29.043162 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.044051 kubelet[2711]: W0115 13:45:29.043176 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.044051 kubelet[2711]: E0115 13:45:29.043198 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.044051 kubelet[2711]: I0115 13:45:29.043243 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9dd88286-18b7-4e6e-a5d3-8c847dab96ba-socket-dir\") pod \"csi-node-driver-zz4wr\" (UID: \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\") " pod="calico-system/csi-node-driver-zz4wr"
Jan 15 13:45:29.045034 kubelet[2711]: E0115 13:45:29.045005 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.045034 kubelet[2711]: W0115 13:45:29.045031 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.045704 kubelet[2711]: E0115 13:45:29.045130 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.045704 kubelet[2711]: I0115 13:45:29.045163 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9dd88286-18b7-4e6e-a5d3-8c847dab96ba-registration-dir\") pod \"csi-node-driver-zz4wr\" (UID: \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\") " pod="calico-system/csi-node-driver-zz4wr"
Jan 15 13:45:29.045704 kubelet[2711]: E0115 13:45:29.045494 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.045704 kubelet[2711]: W0115 13:45:29.045509 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.045704 kubelet[2711]: E0115 13:45:29.045646 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.046212 kubelet[2711]: E0115 13:45:29.045883 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.046212 kubelet[2711]: W0115 13:45:29.045898 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.046212 kubelet[2711]: E0115 13:45:29.045935 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.046679 kubelet[2711]: E0115 13:45:29.046617 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.046679 kubelet[2711]: W0115 13:45:29.046639 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.046990 kubelet[2711]: E0115 13:45:29.046776 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.047314 kubelet[2711]: E0115 13:45:29.047240 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.047314 kubelet[2711]: W0115 13:45:29.047263 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.047314 kubelet[2711]: E0115 13:45:29.047279 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.048014 kubelet[2711]: E0115 13:45:29.047901 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.048014 kubelet[2711]: W0115 13:45:29.047916 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.048014 kubelet[2711]: E0115 13:45:29.047932 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.049259 kubelet[2711]: E0115 13:45:29.048824 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 15 13:45:29.049259 kubelet[2711]: W0115 13:45:29.048847 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 15 13:45:29.049259 kubelet[2711]: E0115 13:45:29.048864 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 15 13:45:29.087481 containerd[1513]: time="2025-01-15T13:45:29.084523760Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 15 13:45:29.087770 containerd[1513]: time="2025-01-15T13:45:29.086050637Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 15 13:45:29.087770 containerd[1513]: time="2025-01-15T13:45:29.086082048Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:29.126756 systemd[1]: Started cri-containerd-f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19.scope - libcontainer container f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19. Jan 15 13:45:29.147663 kubelet[2711]: E0115 13:45:29.147617 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.147663 kubelet[2711]: W0115 13:45:29.147653 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.147960 kubelet[2711]: E0115 13:45:29.147687 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.148148 kubelet[2711]: E0115 13:45:29.148117 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.148148 kubelet[2711]: W0115 13:45:29.148139 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.148288 kubelet[2711]: E0115 13:45:29.148174 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.148624 kubelet[2711]: E0115 13:45:29.148578 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.148624 kubelet[2711]: W0115 13:45:29.148608 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.149216 kubelet[2711]: E0115 13:45:29.149181 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.149676 kubelet[2711]: E0115 13:45:29.149644 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.149676 kubelet[2711]: W0115 13:45:29.149667 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.149806 kubelet[2711]: E0115 13:45:29.149704 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 13:45:29.150140 kubelet[2711]: E0115 13:45:29.150032 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.150140 kubelet[2711]: W0115 13:45:29.150054 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.150553 kubelet[2711]: E0115 13:45:29.150148 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.150553 kubelet[2711]: E0115 13:45:29.150374 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.150553 kubelet[2711]: W0115 13:45:29.150388 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.150553 kubelet[2711]: E0115 13:45:29.150463 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.151172 kubelet[2711]: E0115 13:45:29.150765 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.151172 kubelet[2711]: W0115 13:45:29.150781 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.151172 kubelet[2711]: E0115 13:45:29.150872 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.152683 kubelet[2711]: E0115 13:45:29.152658 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.152683 kubelet[2711]: W0115 13:45:29.152680 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.153558 kubelet[2711]: E0115 13:45:29.152819 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 13:45:29.153785 kubelet[2711]: E0115 13:45:29.153760 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.153785 kubelet[2711]: W0115 13:45:29.153782 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.153943 kubelet[2711]: E0115 13:45:29.153918 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 13:45:29.154465 kubelet[2711]: E0115 13:45:29.154189 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:29.154465 kubelet[2711]: W0115 13:45:29.154203 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:29.154774 kubelet[2711]: E0115 13:45:29.154681 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the driver-call.go / plugins.go failure sequence above repeats verbatim through Jan 15 13:45:29.187; duplicate entries omitted]
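These repeated failures are kubelet's FlexVolume plugin prober: for each directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ it executes the driver binary with the single argument init and JSON-decodes stdout. The nodeagent~uds/uds binary has not been installed yet, so the exec fails, stdout is empty, and decoding "" yields "unexpected end of JSON input". A minimal Go sketch of that call path (the driverStatus shape here is simplified, not kubelet's full type):

// sketch of the FlexVolume init handshake that driver-call.go performs
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the JSON a FlexVolume driver must print on stdout.
type driverStatus struct {
	Status       string          `json:"status"` // "Success" or "Failure"
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func probeDriver(path string) (*driverStatus, error) {
	// kubelet invokes: <driver> init
	out, err := exec.Command(path, "init").CombinedOutput()
	if err != nil {
		// with the binary missing, kubelet logs the W0115 "driver call failed" line
		fmt.Printf("driver call failed: %v, output: %q\n", err, out)
	}
	var st driverStatus
	// out is empty when the driver never ran, so Unmarshal returns
	// "unexpected end of JSON input" -- exactly the E0115 error in the log
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, err
	}
	return &st, nil
}

func main() {
	_, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	fmt.Println(err)
}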
Jan 15 13:45:29.226904 containerd[1513]: time="2025-01-15T13:45:29.226730911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6d8cf878c9-59dfn,Uid:4c7ae6ba-25db-42e4-a2dd-054903d9d6d2,Namespace:calico-system,Attempt:0,} returns sandbox id \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\"" Jan 15 13:45:29.233914 containerd[1513]: time="2025-01-15T13:45:29.233689695Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 15 13:45:29.240767 containerd[1513]: time="2025-01-15T13:45:29.240701062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5l5br,Uid:c87ee128-b80e-461f-b0cc-3aafd8d5be53,Namespace:calico-system,Attempt:0,} returns sandbox id \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\"" Jan 15 13:45:30.891024 kubelet[2711]: E0115 13:45:30.888379 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:30.935927 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2832145770.mount: Deactivated successfully.
Jan 15 13:45:32.266863 containerd[1513]: time="2025-01-15T13:45:32.266783448Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:32.268251 containerd[1513]: time="2025-01-15T13:45:32.268153707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Jan 15 13:45:32.269939 containerd[1513]: time="2025-01-15T13:45:32.269399957Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:32.294256 containerd[1513]: time="2025-01-15T13:45:32.294198199Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:32.295505 containerd[1513]: time="2025-01-15T13:45:32.295407721Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 3.061664317s" Jan 15 13:45:32.295505 containerd[1513]: time="2025-01-15T13:45:32.295476384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Jan 15 13:45:32.298764 containerd[1513]: time="2025-01-15T13:45:32.298733911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 15 13:45:32.315403 containerd[1513]: time="2025-01-15T13:45:32.315343573Z" level=info msg="CreateContainer within sandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 15 13:45:32.336845 containerd[1513]: time="2025-01-15T13:45:32.336749945Z" level=info msg="CreateContainer within sandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\"" Jan 15 13:45:32.338101 containerd[1513]: time="2025-01-15T13:45:32.337903621Z" level=info msg="StartContainer for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\"" Jan 15 13:45:32.422763 systemd[1]: Started cri-containerd-c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935.scope - libcontainer container c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935. 
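For context, the pull/create/start sequence above is kubelet driving containerd over the CRI gRPC API: PullImage resolves and unpacks the image (the ImageCreate events for the tag, the config blob, and the digest), CreateContainer registers a container inside the already-running pod sandbox, and StartContainer launches it, which systemd records as a new cri-containerd-<id>.scope unit. A minimal sketch of the same PullImage call (the socket path is containerd's default, not shown in the log):

// sketch of a CRI PullImage request against containerd
package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// containerd's default CRI endpoint (assumed; not shown in the log)
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	img := runtimeapi.NewImageServiceClient(conn)
	// the same request kubelet issues for PullImage "ghcr.io/flatcar/calico/typha:v3.29.1"
	resp, err := img.PullImage(context.Background(), &runtimeapi.PullImageRequest{
		Image: &runtimeapi.ImageSpec{Image: "ghcr.io/flatcar/calico/typha:v3.29.1"},
	})
	if err != nil {
		log.Fatal(err)
	}
	// on success containerd returns the resolved reference (the sha256 id in the log)
	fmt.Println("image ref:", resp.ImageRef)
}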
Jan 15 13:45:32.501496 containerd[1513]: time="2025-01-15T13:45:32.500866708Z" level=info msg="StartContainer for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" returns successfully" Jan 15 13:45:32.889190 kubelet[2711]: E0115 13:45:32.889073 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:33.057786 kubelet[2711]: I0115 13:45:33.057489 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6d8cf878c9-59dfn" podStartSLOduration=1.991339352 podStartE2EDuration="5.057458077s" podCreationTimestamp="2025-01-15 13:45:28 +0000 UTC" firstStartedPulling="2025-01-15 13:45:29.231478317 +0000 UTC m=+23.484156397" lastFinishedPulling="2025-01-15 13:45:32.29759703 +0000 UTC m=+26.550275122" observedRunningTime="2025-01-15 13:45:33.055492027 +0000 UTC m=+27.308170118" watchObservedRunningTime="2025-01-15 13:45:33.057458077 +0000 UTC m=+27.310136163" Jan 15 13:45:33.072866 kubelet[2711]: E0115 13:45:33.072747 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:33.072866 kubelet[2711]: W0115 13:45:33.072786 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:33.072866 kubelet[2711]: E0115 13:45:33.072815 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the driver-call.go / plugins.go failure sequence above repeats verbatim through Jan 15 13:45:33.091; duplicate entries omitted]
Jan 15 13:45:34.042828 kubelet[2711]: I0115 13:45:34.042696 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 13:45:34.084324 kubelet[2711]: E0115 13:45:34.084284 2711 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 13:45:34.084710 kubelet[2711]: W0115 13:45:34.084552 2711 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 13:45:34.084710 kubelet[2711]: E0115 13:45:34.084593 2711 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the driver-call.go / plugins.go failure sequence above repeats verbatim through Jan 15 13:45:34.103; duplicate entries omitted]
Jan 15 13:45:34.129926 containerd[1513]: time="2025-01-15T13:45:34.129850502Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:34.131626 containerd[1513]: time="2025-01-15T13:45:34.131566072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Jan 15 13:45:34.133783 containerd[1513]: time="2025-01-15T13:45:34.133590838Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:34.137301 containerd[1513]: time="2025-01-15T13:45:34.137198214Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:34.139339 containerd[1513]: time="2025-01-15T13:45:34.138482967Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.838349486s" Jan 15 13:45:34.139339 containerd[1513]: time="2025-01-15T13:45:34.138530350Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 15 13:45:34.144026 containerd[1513]: time="2025-01-15T13:45:34.143977016Z" level=info msg="CreateContainer within sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 15 13:45:34.164670 containerd[1513]: time="2025-01-15T13:45:34.163975893Z" level=info msg="CreateContainer within sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\"" Jan 15 13:45:34.165520 containerd[1513]: time="2025-01-15T13:45:34.165424516Z" level=info msg="StartContainer for \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\"" Jan 15 13:45:34.215724 systemd[1]: run-containerd-runc-k8s.io-f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d-runc.rehZiB.mount: Deactivated successfully. Jan 15 13:45:34.223655 systemd[1]: Started cri-containerd-f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d.scope - libcontainer container f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d. Jan 15 13:45:34.284887 containerd[1513]: time="2025-01-15T13:45:34.284798218Z" level=info msg="StartContainer for \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\" returns successfully" Jan 15 13:45:34.313615 systemd[1]: cri-containerd-f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d.scope: Deactivated successfully.
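flexvol-driver is Calico's pod2daemon-flexvol container; its job is to copy the uds driver binary into the nodeagent~uds directory that kubelet has been probing, so the scope deactivating moments after StartContainer is the expected run-to-completion exit rather than a crash, which is consistent with the driver-call.go errors not recurring later in the log. A sketch of watching such an exit with the containerd Go client (the container id is the one from the log; CRI-managed containers conventionally live in containerd's k8s.io namespace):

// sketch: wait for a CRI-managed container's task to exit
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// namespace used by containerd's CRI plugin
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// id copied from the StartContainer lines above
	c, err := client.LoadContainer(ctx, "f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d")
	if err != nil {
		log.Fatal(err)
	}
	task, err := c.Task(ctx, nil)
	if err != nil {
		log.Fatal(err)
	}
	// Wait returns a channel that fires when the task exits --
	// for flexvol-driver that happens as soon as the copy finishes
	statusC, err := task.Wait(ctx)
	if err != nil {
		log.Fatal(err)
	}
	status := <-statusC
	code, exitedAt, err := status.Result()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("exit code %d at %s\n", code, exitedAt)
}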
Jan 15 13:45:34.529207 containerd[1513]: time="2025-01-15T13:45:34.503944610Z" level=info msg="shim disconnected" id=f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d namespace=k8s.io Jan 15 13:45:34.529560 containerd[1513]: time="2025-01-15T13:45:34.529222914Z" level=warning msg="cleaning up after shim disconnected" id=f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d namespace=k8s.io Jan 15 13:45:34.529560 containerd[1513]: time="2025-01-15T13:45:34.529257235Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:45:34.560759 containerd[1513]: time="2025-01-15T13:45:34.560675414Z" level=warning msg="cleanup warnings time=\"2025-01-15T13:45:34Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 15 13:45:34.888722 kubelet[2711]: E0115 13:45:34.888645 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:35.049626 containerd[1513]: time="2025-01-15T13:45:35.049578221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 15 13:45:35.160955 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d-rootfs.mount: Deactivated successfully. Jan 15 13:45:36.888493 kubelet[2711]: E0115 13:45:36.888107 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:38.888607 kubelet[2711]: E0115 13:45:38.888200 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:40.888382 kubelet[2711]: E0115 13:45:40.888243 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:41.209572 containerd[1513]: time="2025-01-15T13:45:41.209275000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:41.210945 containerd[1513]: time="2025-01-15T13:45:41.210880418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Jan 15 13:45:41.212259 containerd[1513]: time="2025-01-15T13:45:41.212133743Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:41.215035 containerd[1513]: time="2025-01-15T13:45:41.215001232Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:41.216585 containerd[1513]: time="2025-01-15T13:45:41.216349306Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 6.166493083s" Jan 15 13:45:41.216585 containerd[1513]: time="2025-01-15T13:45:41.216390993Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 15 13:45:41.220644 containerd[1513]: time="2025-01-15T13:45:41.220516748Z" level=info msg="CreateContainer within sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 15 13:45:41.239857 containerd[1513]: time="2025-01-15T13:45:41.239485884Z" level=info msg="CreateContainer within sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\"" Jan 15 13:45:41.241595 containerd[1513]: time="2025-01-15T13:45:41.241430526Z" level=info msg="StartContainer for \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\"" Jan 15 13:45:41.242994 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1450446827.mount: Deactivated successfully. Jan 15 13:45:41.313363 systemd[1]: run-containerd-runc-k8s.io-70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417-runc.s642vb.mount: Deactivated successfully. Jan 15 13:45:41.324220 systemd[1]: Started cri-containerd-70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417.scope - libcontainer container 70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417. Jan 15 13:45:41.378034 containerd[1513]: time="2025-01-15T13:45:41.377881956Z" level=info msg="StartContainer for \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\" returns successfully" Jan 15 13:45:42.259079 systemd[1]: cri-containerd-70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417.scope: Deactivated successfully. Jan 15 13:45:42.307093 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417-rootfs.mount: Deactivated successfully. 
Jan 15 13:45:42.383167 kubelet[2711]: I0115 13:45:42.379901 2711 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Jan 15 13:45:42.437846 containerd[1513]: time="2025-01-15T13:45:42.437694969Z" level=info msg="shim disconnected" id=70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417 namespace=k8s.io Jan 15 13:45:42.437846 containerd[1513]: time="2025-01-15T13:45:42.437798680Z" level=warning msg="cleaning up after shim disconnected" id=70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417 namespace=k8s.io Jan 15 13:45:42.437846 containerd[1513]: time="2025-01-15T13:45:42.437823253Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:45:42.472817 kubelet[2711]: I0115 13:45:42.472734 2711 topology_manager.go:215] "Topology Admit Handler" podUID="2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a" podNamespace="kube-system" podName="coredns-7db6d8ff4d-qstwj" Jan 15 13:45:42.483036 kubelet[2711]: I0115 13:45:42.482974 2711 topology_manager.go:215] "Topology Admit Handler" podUID="299bee91-9829-4be7-9e66-0041031d6397" podNamespace="kube-system" podName="coredns-7db6d8ff4d-2w45s" Jan 15 13:45:42.488780 kubelet[2711]: I0115 13:45:42.487414 2711 topology_manager.go:215] "Topology Admit Handler" podUID="9c55259c-6a2f-4fd1-8729-52141d279855" podNamespace="calico-system" podName="calico-kube-controllers-8496b9bc76-lh2fq" Jan 15 13:45:42.491310 kubelet[2711]: I0115 13:45:42.491121 2711 topology_manager.go:215] "Topology Admit Handler" podUID="9de999de-69d1-4aa2-96d0-0c16e7b716ad" podNamespace="calico-apiserver" podName="calico-apiserver-54c9c669d7-zgh9j" Jan 15 13:45:42.493988 kubelet[2711]: I0115 13:45:42.493828 2711 topology_manager.go:215] "Topology Admit Handler" podUID="403d6493-86a8-45cb-bcf7-b66df6eeb925" podNamespace="calico-apiserver" podName="calico-apiserver-54c9c669d7-ls2h6" Jan 15 13:45:42.508392 systemd[1]: Created slice kubepods-burstable-pod2cd6f230_e12d_4e00_8ac3_2b9a6443dd5a.slice - libcontainer container kubepods-burstable-pod2cd6f230_e12d_4e00_8ac3_2b9a6443dd5a.slice. Jan 15 13:45:42.515019 kubelet[2711]: W0115 13:45:42.514656 2711 reflector.go:547] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-6yg2e.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'srv-6yg2e.gb1.brightbox.com' and this object Jan 15 13:45:42.515019 kubelet[2711]: E0115 13:45:42.514714 2711 reflector.go:150] object-"calico-apiserver"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-6yg2e.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'srv-6yg2e.gb1.brightbox.com' and this object Jan 15 13:45:42.529564 systemd[1]: Created slice kubepods-burstable-pod299bee91_9829_4be7_9e66_0041031d6397.slice - libcontainer container kubepods-burstable-pod299bee91_9829_4be7_9e66_0041031d6397.slice. Jan 15 13:45:42.544893 systemd[1]: Created slice kubepods-besteffort-pod9c55259c_6a2f_4fd1_8729_52141d279855.slice - libcontainer container kubepods-besteffort-pod9c55259c_6a2f_4fd1_8729_52141d279855.slice. 
Jan 15 13:45:42.565603 kubelet[2711]: I0115 13:45:42.565559 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwxm\" (UniqueName: \"kubernetes.io/projected/299bee91-9829-4be7-9e66-0041031d6397-kube-api-access-4dwxm\") pod \"coredns-7db6d8ff4d-2w45s\" (UID: \"299bee91-9829-4be7-9e66-0041031d6397\") " pod="kube-system/coredns-7db6d8ff4d-2w45s" Jan 15 13:45:42.566086 kubelet[2711]: I0115 13:45:42.565701 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a-config-volume\") pod \"coredns-7db6d8ff4d-qstwj\" (UID: \"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a\") " pod="kube-system/coredns-7db6d8ff4d-qstwj" Jan 15 13:45:42.566086 kubelet[2711]: I0115 13:45:42.565961 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kz4d\" (UniqueName: \"kubernetes.io/projected/2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a-kube-api-access-6kz4d\") pod \"coredns-7db6d8ff4d-qstwj\" (UID: \"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a\") " pod="kube-system/coredns-7db6d8ff4d-qstwj" Jan 15 13:45:42.567299 kubelet[2711]: I0115 13:45:42.567182 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm5vr\" (UniqueName: \"kubernetes.io/projected/403d6493-86a8-45cb-bcf7-b66df6eeb925-kube-api-access-gm5vr\") pod \"calico-apiserver-54c9c669d7-ls2h6\" (UID: \"403d6493-86a8-45cb-bcf7-b66df6eeb925\") " pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" Jan 15 13:45:42.566551 systemd[1]: Created slice kubepods-besteffort-pod9de999de_69d1_4aa2_96d0_0c16e7b716ad.slice - libcontainer container kubepods-besteffort-pod9de999de_69d1_4aa2_96d0_0c16e7b716ad.slice. 
Jan 15 13:45:42.569113 kubelet[2711]: I0115 13:45:42.567574 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjk5\" (UniqueName: \"kubernetes.io/projected/9c55259c-6a2f-4fd1-8729-52141d279855-kube-api-access-dqjk5\") pod \"calico-kube-controllers-8496b9bc76-lh2fq\" (UID: \"9c55259c-6a2f-4fd1-8729-52141d279855\") " pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" Jan 15 13:45:42.569113 kubelet[2711]: I0115 13:45:42.567656 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9de999de-69d1-4aa2-96d0-0c16e7b716ad-calico-apiserver-certs\") pod \"calico-apiserver-54c9c669d7-zgh9j\" (UID: \"9de999de-69d1-4aa2-96d0-0c16e7b716ad\") " pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" Jan 15 13:45:42.569113 kubelet[2711]: I0115 13:45:42.567692 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4588j\" (UniqueName: \"kubernetes.io/projected/9de999de-69d1-4aa2-96d0-0c16e7b716ad-kube-api-access-4588j\") pod \"calico-apiserver-54c9c669d7-zgh9j\" (UID: \"9de999de-69d1-4aa2-96d0-0c16e7b716ad\") " pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" Jan 15 13:45:42.569113 kubelet[2711]: I0115 13:45:42.567730 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c55259c-6a2f-4fd1-8729-52141d279855-tigera-ca-bundle\") pod \"calico-kube-controllers-8496b9bc76-lh2fq\" (UID: \"9c55259c-6a2f-4fd1-8729-52141d279855\") " pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" Jan 15 13:45:42.569113 kubelet[2711]: I0115 13:45:42.567764 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/299bee91-9829-4be7-9e66-0041031d6397-config-volume\") pod \"coredns-7db6d8ff4d-2w45s\" (UID: \"299bee91-9829-4be7-9e66-0041031d6397\") " pod="kube-system/coredns-7db6d8ff4d-2w45s" Jan 15 13:45:42.569433 kubelet[2711]: I0115 13:45:42.567792 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/403d6493-86a8-45cb-bcf7-b66df6eeb925-calico-apiserver-certs\") pod \"calico-apiserver-54c9c669d7-ls2h6\" (UID: \"403d6493-86a8-45cb-bcf7-b66df6eeb925\") " pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" Jan 15 13:45:42.578299 systemd[1]: Created slice kubepods-besteffort-pod403d6493_86a8_45cb_bcf7_b66df6eeb925.slice - libcontainer container kubepods-besteffort-pod403d6493_86a8_45cb_bcf7_b66df6eeb925.slice. 
Jan 15 13:45:42.821423 containerd[1513]: time="2025-01-15T13:45:42.820677798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qstwj,Uid:2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a,Namespace:kube-system,Attempt:0,}" Jan 15 13:45:42.851728 containerd[1513]: time="2025-01-15T13:45:42.851195530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2w45s,Uid:299bee91-9829-4be7-9e66-0041031d6397,Namespace:kube-system,Attempt:0,}" Jan 15 13:45:42.862975 containerd[1513]: time="2025-01-15T13:45:42.861561961Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8496b9bc76-lh2fq,Uid:9c55259c-6a2f-4fd1-8729-52141d279855,Namespace:calico-system,Attempt:0,}" Jan 15 13:45:42.902763 systemd[1]: Created slice kubepods-besteffort-pod9dd88286_18b7_4e6e_a5d3_8c847dab96ba.slice - libcontainer container kubepods-besteffort-pod9dd88286_18b7_4e6e_a5d3_8c847dab96ba.slice. Jan 15 13:45:42.918027 containerd[1513]: time="2025-01-15T13:45:42.917976069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zz4wr,Uid:9dd88286-18b7-4e6e-a5d3-8c847dab96ba,Namespace:calico-system,Attempt:0,}" Jan 15 13:45:43.111664 containerd[1513]: time="2025-01-15T13:45:43.109986720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 15 13:45:43.187966 containerd[1513]: time="2025-01-15T13:45:43.182390413Z" level=error msg="Failed to destroy network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.197945 containerd[1513]: time="2025-01-15T13:45:43.197876063Z" level=error msg="Failed to destroy network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.198658 containerd[1513]: time="2025-01-15T13:45:43.198189601Z" level=error msg="encountered an error cleaning up failed sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.198658 containerd[1513]: time="2025-01-15T13:45:43.198272633Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8496b9bc76-lh2fq,Uid:9c55259c-6a2f-4fd1-8729-52141d279855,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.198658 containerd[1513]: time="2025-01-15T13:45:43.198577420Z" level=error msg="encountered an error cleaning up failed sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Jan 15 13:45:43.198658 containerd[1513]: time="2025-01-15T13:45:43.198633065Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2w45s,Uid:299bee91-9829-4be7-9e66-0041031d6397,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.205134 containerd[1513]: time="2025-01-15T13:45:43.205077690Z" level=error msg="Failed to destroy network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.206658 containerd[1513]: time="2025-01-15T13:45:43.205584064Z" level=error msg="encountered an error cleaning up failed sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.206658 containerd[1513]: time="2025-01-15T13:45:43.205651820Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qstwj,Uid:2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.206781 kubelet[2711]: E0115 13:45:43.206106 2711 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.206781 kubelet[2711]: E0115 13:45:43.206200 2711 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" Jan 15 13:45:43.206781 kubelet[2711]: E0115 13:45:43.206246 2711 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" Jan 15 13:45:43.206966 kubelet[2711]: E0115 13:45:43.206312 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-kube-controllers-8496b9bc76-lh2fq_calico-system(9c55259c-6a2f-4fd1-8729-52141d279855)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8496b9bc76-lh2fq_calico-system(9c55259c-6a2f-4fd1-8729-52141d279855)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" podUID="9c55259c-6a2f-4fd1-8729-52141d279855" Jan 15 13:45:43.206966 kubelet[2711]: E0115 13:45:43.206003 2711 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.206966 kubelet[2711]: E0115 13:45:43.206537 2711 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-qstwj" Jan 15 13:45:43.207163 kubelet[2711]: E0115 13:45:43.206563 2711 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-qstwj" Jan 15 13:45:43.207163 kubelet[2711]: E0115 13:45:43.206602 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-qstwj_kube-system(2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-qstwj_kube-system(2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-qstwj" podUID="2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a" Jan 15 13:45:43.207163 kubelet[2711]: E0115 13:45:43.206653 2711 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.207325 kubelet[2711]: E0115 13:45:43.206683 2711 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-2w45s" Jan 15 13:45:43.207325 kubelet[2711]: E0115 13:45:43.206708 2711 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-2w45s" Jan 15 13:45:43.207325 kubelet[2711]: E0115 13:45:43.206743 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-2w45s_kube-system(299bee91-9829-4be7-9e66-0041031d6397)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-2w45s_kube-system(299bee91-9829-4be7-9e66-0041031d6397)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-2w45s" podUID="299bee91-9829-4be7-9e66-0041031d6397" Jan 15 13:45:43.217753 containerd[1513]: time="2025-01-15T13:45:43.217688849Z" level=error msg="Failed to destroy network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.218299 containerd[1513]: time="2025-01-15T13:45:43.218242590Z" level=error msg="encountered an error cleaning up failed sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.218428 containerd[1513]: time="2025-01-15T13:45:43.218314838Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zz4wr,Uid:9dd88286-18b7-4e6e-a5d3-8c847dab96ba,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.218715 kubelet[2711]: E0115 13:45:43.218603 2711 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:43.219400 kubelet[2711]: E0115 13:45:43.218803 2711 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zz4wr" Jan 15 13:45:43.219400 kubelet[2711]: E0115 13:45:43.218847 2711 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zz4wr" Jan 15 13:45:43.219400 kubelet[2711]: E0115 13:45:43.218923 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zz4wr_calico-system(9dd88286-18b7-4e6e-a5d3-8c847dab96ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zz4wr_calico-system(9dd88286-18b7-4e6e-a5d3-8c847dab96ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:43.685743 kubelet[2711]: E0115 13:45:43.685668 2711 projected.go:294] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 15 13:45:43.686517 kubelet[2711]: E0115 13:45:43.685751 2711 projected.go:200] Error preparing data for projected volume kube-api-access-4588j for pod calico-apiserver/calico-apiserver-54c9c669d7-zgh9j: failed to sync configmap cache: timed out waiting for the condition Jan 15 13:45:43.686517 kubelet[2711]: E0115 13:45:43.685924 2711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9de999de-69d1-4aa2-96d0-0c16e7b716ad-kube-api-access-4588j podName:9de999de-69d1-4aa2-96d0-0c16e7b716ad nodeName:}" failed. No retries permitted until 2025-01-15 13:45:44.185858657 +0000 UTC m=+38.438536737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4588j" (UniqueName: "kubernetes.io/projected/9de999de-69d1-4aa2-96d0-0c16e7b716ad-kube-api-access-4588j") pod "calico-apiserver-54c9c669d7-zgh9j" (UID: "9de999de-69d1-4aa2-96d0-0c16e7b716ad") : failed to sync configmap cache: timed out waiting for the condition Jan 15 13:45:43.696318 kubelet[2711]: E0115 13:45:43.696157 2711 projected.go:294] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 15 13:45:43.696318 kubelet[2711]: E0115 13:45:43.696203 2711 projected.go:200] Error preparing data for projected volume kube-api-access-gm5vr for pod calico-apiserver/calico-apiserver-54c9c669d7-ls2h6: failed to sync configmap cache: timed out waiting for the condition Jan 15 13:45:43.696318 kubelet[2711]: E0115 13:45:43.696305 2711 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/403d6493-86a8-45cb-bcf7-b66df6eeb925-kube-api-access-gm5vr podName:403d6493-86a8-45cb-bcf7-b66df6eeb925 nodeName:}" failed. No retries permitted until 2025-01-15 13:45:44.196273791 +0000 UTC m=+38.448951863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gm5vr" (UniqueName: "kubernetes.io/projected/403d6493-86a8-45cb-bcf7-b66df6eeb925-kube-api-access-gm5vr") pod "calico-apiserver-54c9c669d7-ls2h6" (UID: "403d6493-86a8-45cb-bcf7-b66df6eeb925") : failed to sync configmap cache: timed out waiting for the condition Jan 15 13:45:44.107993 kubelet[2711]: I0115 13:45:44.107918 2711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:45:44.110205 kubelet[2711]: I0115 13:45:44.110007 2711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:45:44.118125 containerd[1513]: time="2025-01-15T13:45:44.116374145Z" level=info msg="StopPodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\"" Jan 15 13:45:44.118662 containerd[1513]: time="2025-01-15T13:45:44.118124813Z" level=info msg="Ensure that sandbox 0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d in task-service has been cleanup successfully" Jan 15 13:45:44.119486 containerd[1513]: time="2025-01-15T13:45:44.118800581Z" level=info msg="StopPodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\"" Jan 15 13:45:44.119486 containerd[1513]: time="2025-01-15T13:45:44.119141602Z" level=info msg="Ensure that sandbox 480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606 in task-service has been cleanup successfully" Jan 15 13:45:44.120198 kubelet[2711]: I0115 13:45:44.120159 2711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:45:44.121416 containerd[1513]: time="2025-01-15T13:45:44.121370038Z" level=info msg="StopPodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\"" Jan 15 13:45:44.121720 containerd[1513]: time="2025-01-15T13:45:44.121581867Z" level=info msg="Ensure that sandbox 3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea in task-service has been cleanup successfully" Jan 15 13:45:44.126748 kubelet[2711]: I0115 13:45:44.126673 2711 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:45:44.128566 containerd[1513]: time="2025-01-15T13:45:44.127935702Z" level=info msg="StopPodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\"" Jan 15 13:45:44.128566 containerd[1513]: time="2025-01-15T13:45:44.128393128Z" level=info msg="Ensure that sandbox e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7 in task-service has been cleanup successfully" Jan 15 13:45:44.206004 containerd[1513]: time="2025-01-15T13:45:44.205479344Z" level=error msg="StopPodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" failed" error="failed to destroy network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.206004 containerd[1513]: time="2025-01-15T13:45:44.205979644Z" level=error msg="StopPodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" failed" error="failed to destroy network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.206683 kubelet[2711]: E0115 13:45:44.206343 2711 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:45:44.208144 kubelet[2711]: E0115 13:45:44.206704 2711 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7"} Jan 15 13:45:44.208144 kubelet[2711]: E0115 13:45:44.206833 2711 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 15 13:45:44.208144 kubelet[2711]: E0115 13:45:44.206870 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-qstwj" podUID="2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a" Jan 15 13:45:44.208144 kubelet[2711]: E0115 13:45:44.208092 2711 remote_runtime.go:222] "StopPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:45:44.208144 kubelet[2711]: E0115 13:45:44.208132 2711 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606"} Jan 15 13:45:44.209310 kubelet[2711]: E0115 13:45:44.208167 2711 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 15 13:45:44.209310 kubelet[2711]: E0115 13:45:44.208270 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9dd88286-18b7-4e6e-a5d3-8c847dab96ba\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zz4wr" podUID="9dd88286-18b7-4e6e-a5d3-8c847dab96ba" Jan 15 13:45:44.220522 containerd[1513]: time="2025-01-15T13:45:44.220428050Z" level=error msg="StopPodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" failed" error="failed to destroy network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.220848 containerd[1513]: time="2025-01-15T13:45:44.220642238Z" level=error msg="StopPodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" failed" error="failed to destroy network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.220981 kubelet[2711]: E0115 13:45:44.220925 2711 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:45:44.221058 kubelet[2711]: E0115 13:45:44.221005 2711 kuberuntime_manager.go:1375] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea"} Jan 15 13:45:44.221596 kubelet[2711]: E0115 13:45:44.221056 2711 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"299bee91-9829-4be7-9e66-0041031d6397\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 15 13:45:44.221596 kubelet[2711]: E0115 13:45:44.221093 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"299bee91-9829-4be7-9e66-0041031d6397\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-2w45s" podUID="299bee91-9829-4be7-9e66-0041031d6397" Jan 15 13:45:44.221596 kubelet[2711]: E0115 13:45:44.221270 2711 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:45:44.221596 kubelet[2711]: E0115 13:45:44.221307 2711 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d"} Jan 15 13:45:44.221927 kubelet[2711]: E0115 13:45:44.221339 2711 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9c55259c-6a2f-4fd1-8729-52141d279855\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 15 13:45:44.221927 kubelet[2711]: E0115 13:45:44.221422 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9c55259c-6a2f-4fd1-8729-52141d279855\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" podUID="9c55259c-6a2f-4fd1-8729-52141d279855" Jan 15 13:45:44.373746 containerd[1513]: time="2025-01-15T13:45:44.373485844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-zgh9j,Uid:9de999de-69d1-4aa2-96d0-0c16e7b716ad,Namespace:calico-apiserver,Attempt:0,}" 
Jan 15 13:45:44.387049 containerd[1513]: time="2025-01-15T13:45:44.386986112Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-ls2h6,Uid:403d6493-86a8-45cb-bcf7-b66df6eeb925,Namespace:calico-apiserver,Attempt:0,}" Jan 15 13:45:44.493462 containerd[1513]: time="2025-01-15T13:45:44.491052303Z" level=error msg="Failed to destroy network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.496555 containerd[1513]: time="2025-01-15T13:45:44.493789920Z" level=error msg="encountered an error cleaning up failed sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.496555 containerd[1513]: time="2025-01-15T13:45:44.493880217Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-zgh9j,Uid:9de999de-69d1-4aa2-96d0-0c16e7b716ad,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.496737 kubelet[2711]: E0115 13:45:44.494166 2711 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.496737 kubelet[2711]: E0115 13:45:44.494247 2711 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" Jan 15 13:45:44.496737 kubelet[2711]: E0115 13:45:44.494280 2711 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" Jan 15 13:45:44.497575 kubelet[2711]: E0115 13:45:44.494351 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54c9c669d7-zgh9j_calico-apiserver(9de999de-69d1-4aa2-96d0-0c16e7b716ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54c9c669d7-zgh9j_calico-apiserver(9de999de-69d1-4aa2-96d0-0c16e7b716ad)\\\": rpc error: code = Unknown desc = 
failed to setup network for sandbox \\\"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" podUID="9de999de-69d1-4aa2-96d0-0c16e7b716ad" Jan 15 13:45:44.498982 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853-shm.mount: Deactivated successfully. Jan 15 13:45:44.505463 containerd[1513]: time="2025-01-15T13:45:44.504245483Z" level=error msg="Failed to destroy network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.509034 containerd[1513]: time="2025-01-15T13:45:44.508976541Z" level=error msg="encountered an error cleaning up failed sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.509128 containerd[1513]: time="2025-01-15T13:45:44.509095736Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-ls2h6,Uid:403d6493-86a8-45cb-bcf7-b66df6eeb925,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.509516 kubelet[2711]: E0115 13:45:44.509470 2711 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:44.509603 kubelet[2711]: E0115 13:45:44.509545 2711 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" Jan 15 13:45:44.509603 kubelet[2711]: E0115 13:45:44.509582 2711 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" Jan 15 13:45:44.509693 kubelet[2711]: E0115 13:45:44.509645 2711 pod_workers.go:1298] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54c9c669d7-ls2h6_calico-apiserver(403d6493-86a8-45cb-bcf7-b66df6eeb925)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54c9c669d7-ls2h6_calico-apiserver(403d6493-86a8-45cb-bcf7-b66df6eeb925)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" podUID="403d6493-86a8-45cb-bcf7-b66df6eeb925" Jan 15 13:45:44.512173 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26-shm.mount: Deactivated successfully. Jan 15 13:45:45.130966 kubelet[2711]: I0115 13:45:45.130909 2711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:45:45.140717 containerd[1513]: time="2025-01-15T13:45:45.139971773Z" level=info msg="StopPodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\"" Jan 15 13:45:45.140717 containerd[1513]: time="2025-01-15T13:45:45.140311570Z" level=info msg="Ensure that sandbox 05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26 in task-service has been cleanup successfully" Jan 15 13:45:45.141853 kubelet[2711]: I0115 13:45:45.141792 2711 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:45:45.143414 containerd[1513]: time="2025-01-15T13:45:45.143242596Z" level=info msg="StopPodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\"" Jan 15 13:45:45.143530 containerd[1513]: time="2025-01-15T13:45:45.143494339Z" level=info msg="Ensure that sandbox b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853 in task-service has been cleanup successfully" Jan 15 13:45:45.229064 containerd[1513]: time="2025-01-15T13:45:45.228407251Z" level=error msg="StopPodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" failed" error="failed to destroy network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:45.229247 kubelet[2711]: E0115 13:45:45.228825 2711 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:45:45.229247 kubelet[2711]: E0115 13:45:45.228909 2711 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853"} Jan 15 13:45:45.229247 kubelet[2711]: E0115 13:45:45.228962 2711 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" 
err="failed to \"KillPodSandbox\" for \"9de999de-69d1-4aa2-96d0-0c16e7b716ad\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 15 13:45:45.229247 kubelet[2711]: E0115 13:45:45.228996 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9de999de-69d1-4aa2-96d0-0c16e7b716ad\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" podUID="9de999de-69d1-4aa2-96d0-0c16e7b716ad" Jan 15 13:45:45.230629 containerd[1513]: time="2025-01-15T13:45:45.230551398Z" level=error msg="StopPodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" failed" error="failed to destroy network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 13:45:45.231153 kubelet[2711]: E0115 13:45:45.230750 2711 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:45:45.231153 kubelet[2711]: E0115 13:45:45.230808 2711 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26"} Jan 15 13:45:45.231153 kubelet[2711]: E0115 13:45:45.230847 2711 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"403d6493-86a8-45cb-bcf7-b66df6eeb925\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 15 13:45:45.231153 kubelet[2711]: E0115 13:45:45.230875 2711 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"403d6493-86a8-45cb-bcf7-b66df6eeb925\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" 
podUID="403d6493-86a8-45cb-bcf7-b66df6eeb925" Jan 15 13:45:52.579567 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2439669212.mount: Deactivated successfully. Jan 15 13:45:52.708175 containerd[1513]: time="2025-01-15T13:45:52.707122267Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 15 13:45:52.724103 containerd[1513]: time="2025-01-15T13:45:52.724009261Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 9.603116587s" Jan 15 13:45:52.724103 containerd[1513]: time="2025-01-15T13:45:52.724066331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:52.724661 containerd[1513]: time="2025-01-15T13:45:52.724081302Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 15 13:45:52.750949 containerd[1513]: time="2025-01-15T13:45:52.750835258Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:52.759947 containerd[1513]: time="2025-01-15T13:45:52.759249230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:52.833873 containerd[1513]: time="2025-01-15T13:45:52.833234849Z" level=info msg="CreateContainer within sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 15 13:45:52.900692 containerd[1513]: time="2025-01-15T13:45:52.900620923Z" level=info msg="CreateContainer within sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\"" Jan 15 13:45:52.905350 containerd[1513]: time="2025-01-15T13:45:52.905309992Z" level=info msg="StartContainer for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\"" Jan 15 13:45:53.085651 systemd[1]: Started cri-containerd-0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e.scope - libcontainer container 0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e. 
Jan 15 13:45:53.160908 containerd[1513]: time="2025-01-15T13:45:53.160848106Z" level=info msg="StartContainer for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" returns successfully" Jan 15 13:45:53.382763 kubelet[2711]: I0115 13:45:53.380890 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5l5br" podStartSLOduration=1.74305999 podStartE2EDuration="25.250808144s" podCreationTimestamp="2025-01-15 13:45:28 +0000 UTC" firstStartedPulling="2025-01-15 13:45:29.243528494 +0000 UTC m=+23.496206566" lastFinishedPulling="2025-01-15 13:45:52.751276641 +0000 UTC m=+47.003954720" observedRunningTime="2025-01-15 13:45:53.24312809 +0000 UTC m=+47.495806183" watchObservedRunningTime="2025-01-15 13:45:53.250808144 +0000 UTC m=+47.503486246" Jan 15 13:45:53.505374 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 15 13:45:53.506858 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Jan 15 13:45:54.889337 containerd[1513]: time="2025-01-15T13:45:54.889204048Z" level=info msg="StopPodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\"" Jan 15 13:45:54.892147 containerd[1513]: time="2025-01-15T13:45:54.889362510Z" level=info msg="StopPodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\"" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.017 [INFO][3923] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.019 [INFO][3923] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" iface="eth0" netns="/var/run/netns/cni-6f8debd5-ec2a-03bb-d154-d549684e7b78" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.020 [INFO][3923] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" iface="eth0" netns="/var/run/netns/cni-6f8debd5-ec2a-03bb-d154-d549684e7b78" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.027 [INFO][3923] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" iface="eth0" netns="/var/run/netns/cni-6f8debd5-ec2a-03bb-d154-d549684e7b78" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.027 [INFO][3923] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.027 [INFO][3923] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.325 [INFO][3959] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.340 [INFO][3959] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
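The pod_startup_latency_tracker entry above is plain arithmetic over the logged timestamps: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (13:45:53.250808144 minus 13:45:28 gives 25.250808144s), and podStartSLOduration additionally subtracts the image-pull window, 25.250808144 - (13:45:52.751276641 - 13:45:29.243528494) = 1.743059997s, which matches the logged 1.74305999 up to float formatting. A small sketch reproducing the calculation with Go's time package; the timestamps are copied from the log, the program is illustrative:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-01-15 13:45:28 +0000 UTC")
        firstPull := mustParse("2025-01-15 13:45:29.243528494 +0000 UTC")
        lastPull := mustParse("2025-01-15 13:45:52.751276641 +0000 UTC")
        running := mustParse("2025-01-15 13:45:53.250808144 +0000 UTC")

        e2e := running.Sub(created)          // 25.250808144s, the logged E2E duration
        slo := e2e - lastPull.Sub(firstPull) // 1.743059997s, ~ the logged SLO duration
        fmt.Println(e2e, slo)
    }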
Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.340 [INFO][3959] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.360 [WARNING][3959] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.360 [INFO][3959] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.366 [INFO][3959] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:55.381761 containerd[1513]: 2025-01-15 13:45:55.373 [INFO][3923] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:45:55.396655 systemd[1]: run-netns-cni\x2d6f8debd5\x2dec2a\x2d03bb\x2dd154\x2dd549684e7b78.mount: Deactivated successfully. Jan 15 13:45:55.410359 containerd[1513]: time="2025-01-15T13:45:55.410284469Z" level=info msg="TearDown network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" successfully" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.022 [INFO][3922] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.026 [INFO][3922] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" iface="eth0" netns="/var/run/netns/cni-890d2ba5-d4ec-6cc7-1cb7-56e26289251c" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.027 [INFO][3922] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" iface="eth0" netns="/var/run/netns/cni-890d2ba5-d4ec-6cc7-1cb7-56e26289251c" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.028 [INFO][3922] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" iface="eth0" netns="/var/run/netns/cni-890d2ba5-d4ec-6cc7-1cb7-56e26289251c" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.028 [INFO][3922] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.028 [INFO][3922] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.325 [INFO][3960] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.341 [INFO][3960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.366 [INFO][3960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.379 [WARNING][3960] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.380 [INFO][3960] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.384 [INFO][3960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:55.412718 containerd[1513]: 2025-01-15 13:45:55.408 [INFO][3922] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:45:55.417374 containerd[1513]: time="2025-01-15T13:45:55.410717069Z" level=info msg="StopPodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" returns successfully" Jan 15 13:45:55.417374 containerd[1513]: time="2025-01-15T13:45:55.413398615Z" level=info msg="TearDown network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" successfully" Jan 15 13:45:55.417374 containerd[1513]: time="2025-01-15T13:45:55.416637759Z" level=info msg="StopPodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" returns successfully" Jan 15 13:45:55.423317 systemd[1]: run-netns-cni\x2d890d2ba5\x2dd4ec\x2d6cc7\x2d1cb7\x2d56e26289251c.mount: Deactivated successfully. 
Jan 15 13:45:55.438662 containerd[1513]: time="2025-01-15T13:45:55.438412713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zz4wr,Uid:9dd88286-18b7-4e6e-a5d3-8c847dab96ba,Namespace:calico-system,Attempt:1,}" Jan 15 13:45:55.444233 containerd[1513]: time="2025-01-15T13:45:55.438578416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qstwj,Uid:2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a,Namespace:kube-system,Attempt:1,}" Jan 15 13:45:55.805780 systemd-networkd[1436]: cali48ccfec3b47: Link UP Jan 15 13:45:55.808878 systemd-networkd[1436]: cali48ccfec3b47: Gained carrier Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.615 [INFO][4062] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.651 [INFO][4062] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0 csi-node-driver- calico-system 9dd88286-18b7-4e6e-a5d3-8c847dab96ba 788 0 2025-01-15 13:45:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com csi-node-driver-zz4wr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali48ccfec3b47 [] []}} ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.651 [INFO][4062] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.720 [INFO][4090] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" HandleID="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.741 [INFO][4090] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" HandleID="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003058d0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"csi-node-driver-zz4wr", "timestamp":"2025-01-15 13:45:55.720090369 +0000 UTC"}, Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.741 [INFO][4090] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
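The "File /var/lib/calico/mtu does not exist" line above is benign: calico/node writes its auto-detected MTU to that file, and while it is missing the plugin proceeds with the MTU from its network config instead. A minimal sketch of that read-with-fallback behaviour; the fallback value of 1500 is an assumption, not taken from the log:

    package main

    import (
        "fmt"
        "os"
        "strconv"
        "strings"
    )

    // readCalicoMTU mirrors the behaviour implied by the log line above: use the
    // file when calico/node has written it, otherwise fall back to a
    // caller-supplied default (hypothetical here).
    func readCalicoMTU(fallback int) int {
        b, err := os.ReadFile("/var/lib/calico/mtu")
        if err != nil {
            return fallback
        }
        mtu, err := strconv.Atoi(strings.TrimSpace(string(b)))
        if err != nil {
            return fallback
        }
        return mtu
    }

    func main() {
        fmt.Println("effective MTU:", readCalicoMTU(1500))
    }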
Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.741 [INFO][4090] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.741 [INFO][4090] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.745 [INFO][4090] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.753 [INFO][4090] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.759 [INFO][4090] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.763 [INFO][4090] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.766 [INFO][4090] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.766 [INFO][4090] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.768 [INFO][4090] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.773 [INFO][4090] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 handle="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.781 [INFO][4090] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.193/26] block=192.168.95.192/26 handle="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.781 [INFO][4090] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.193/26] handle="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.781 [INFO][4090] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
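The IPAM trace above is Calico's block-affinity happy path: this node holds an affinity for 192.168.95.192/26 (64 addresses, .192 through .255), serializes concurrent assignments behind the host-wide lock, and claims 192.168.95.193 for csi-node-driver-zz4wr. The block arithmetic, checked with the standard library:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // The affinity block from the log.
        _, block, _ := net.ParseCIDR("192.168.95.192/26")
        ones, bits := block.Mask.Size()
        fmt.Printf("block %s holds %d addresses\n", block, 1<<(bits-ones)) // 64

        // .193 is the address the trace claims for the csi-node-driver pod.
        fmt.Println(block.Contains(net.ParseIP("192.168.95.193"))) // true
        fmt.Println(block.Contains(net.ParseIP("192.168.96.1")))   // false
    }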
Jan 15 13:45:55.870097 containerd[1513]: 2025-01-15 13:45:55.781 [INFO][4090] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.193/26] IPv6=[] ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" HandleID="k8s-pod-network.dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.875250 containerd[1513]: 2025-01-15 13:45:55.786 [INFO][4062] cni-plugin/k8s.go 386: Populated endpoint ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9dd88286-18b7-4e6e-a5d3-8c847dab96ba", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-zz4wr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali48ccfec3b47", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:55.875250 containerd[1513]: 2025-01-15 13:45:55.786 [INFO][4062] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.193/32] ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.875250 containerd[1513]: 2025-01-15 13:45:55.787 [INFO][4062] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali48ccfec3b47 ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.875250 containerd[1513]: 2025-01-15 13:45:55.808 [INFO][4062] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.875250 containerd[1513]: 2025-01-15 13:45:55.811 [INFO][4062] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9dd88286-18b7-4e6e-a5d3-8c847dab96ba", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c", Pod:"csi-node-driver-zz4wr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali48ccfec3b47", MAC:"aa:00:57:bc:ec:59", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:55.875250 containerd[1513]: 2025-01-15 13:45:55.850 [INFO][4062] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c" Namespace="calico-system" Pod="csi-node-driver-zz4wr" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:45:55.891042 systemd-networkd[1436]: cali93d08021791: Link UP Jan 15 13:45:55.891384 systemd-networkd[1436]: cali93d08021791: Gained carrier Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.605 [INFO][4056] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.632 [INFO][4056] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0 coredns-7db6d8ff4d- kube-system 2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a 787 0 2025-01-15 13:45:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com coredns-7db6d8ff4d-qstwj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali93d08021791 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.633 [INFO][4056] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.726 [INFO][4086] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" HandleID="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.744 [INFO][4086] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" HandleID="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00043f2a0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"coredns-7db6d8ff4d-qstwj", "timestamp":"2025-01-15 13:45:55.72604148 +0000 UTC"}, Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.744 [INFO][4086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.782 [INFO][4086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.782 [INFO][4086] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.785 [INFO][4086] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.796 [INFO][4086] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.814 [INFO][4086] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.835 [INFO][4086] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.842 [INFO][4086] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.843 [INFO][4086] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.846 [INFO][4086] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27 Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.858 [INFO][4086] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 
handle="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.868 [INFO][4086] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.194/26] block=192.168.95.192/26 handle="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.868 [INFO][4086] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.194/26] handle="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.868 [INFO][4086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:55.922248 containerd[1513]: 2025-01-15 13:45:55.868 [INFO][4086] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.194/26] IPv6=[] ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" HandleID="k8s-pod-network.b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.924329 containerd[1513]: 2025-01-15 13:45:55.876 [INFO][4056] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7db6d8ff4d-qstwj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93d08021791", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:55.924329 containerd[1513]: 2025-01-15 13:45:55.877 [INFO][4056] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.194/32] 
ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.924329 containerd[1513]: 2025-01-15 13:45:55.877 [INFO][4056] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali93d08021791 ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.924329 containerd[1513]: 2025-01-15 13:45:55.888 [INFO][4056] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:45:55.924329 containerd[1513]: 2025-01-15 13:45:55.888 [INFO][4056] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27", Pod:"coredns-7db6d8ff4d-qstwj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93d08021791", MAC:"62:62:50:0b:6a:3b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:55.924329 containerd[1513]: 2025-01-15 13:45:55.917 [INFO][4056] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qstwj" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 
13:45:55.961166 containerd[1513]: time="2025-01-15T13:45:55.960760448Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:45:55.961166 containerd[1513]: time="2025-01-15T13:45:55.960878831Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:45:55.961166 containerd[1513]: time="2025-01-15T13:45:55.960898761Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:55.961484 containerd[1513]: time="2025-01-15T13:45:55.961070603Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:55.975037 containerd[1513]: time="2025-01-15T13:45:55.974574910Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:45:55.975037 containerd[1513]: time="2025-01-15T13:45:55.974653255Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:45:55.975037 containerd[1513]: time="2025-01-15T13:45:55.974670154Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:55.975037 containerd[1513]: time="2025-01-15T13:45:55.974801577Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:55.997673 systemd[1]: Started cri-containerd-dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c.scope - libcontainer container dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c. Jan 15 13:45:56.012650 systemd[1]: Started cri-containerd-b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27.scope - libcontainer container b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27. 
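The repeated "loading plugin io.containerd.*" quartet above appears once per sandbox because every RunPodSandbox spawns its own containerd-shim-runc-v2 process, and each new task is then wrapped in a transient cri-containerd-<id>.scope unit so systemd accounts the pod's cgroup separately. A hedged sketch of the same create-task path through containerd's Go client; the socket path and the CRI "k8s.io" namespace are the conventional values, and the container and snapshot names are made up:

    package main

    import (
        "context"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/cio"
        "github.com/containerd/containerd/namespaces"
        "github.com/containerd/containerd/oci"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // CRI keeps its resources in the "k8s.io" namespace.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        image, err := client.GetImage(ctx, "ghcr.io/flatcar/calico/node:v3.29.1")
        if err != nil {
            log.Fatal(err)
        }
        container, err := client.NewContainer(ctx, "demo",
            containerd.WithNewSnapshot("demo-snap", image),
            containerd.WithNewSpec(oci.WithImageConfig(image)),
        )
        if err != nil {
            log.Fatal(err)
        }
        defer container.Delete(ctx, containerd.WithSnapshotCleanup)

        // NewTask is what forks containerd-shim-runc-v2 and produces the
        // "loading plugin io.containerd.ttrpc.v1.task ..." lines above.
        task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
        if err != nil {
            log.Fatal(err)
        }
        defer task.Delete(ctx)
        if err := task.Start(ctx); err != nil {
            log.Fatal(err)
        }
    }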
Jan 15 13:45:56.098802 containerd[1513]: time="2025-01-15T13:45:56.098663918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zz4wr,Uid:9dd88286-18b7-4e6e-a5d3-8c847dab96ba,Namespace:calico-system,Attempt:1,} returns sandbox id \"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c\"" Jan 15 13:45:56.114707 containerd[1513]: time="2025-01-15T13:45:56.114649091Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 15 13:45:56.123649 containerd[1513]: time="2025-01-15T13:45:56.123605202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qstwj,Uid:2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a,Namespace:kube-system,Attempt:1,} returns sandbox id \"b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27\"" Jan 15 13:45:56.135154 containerd[1513]: time="2025-01-15T13:45:56.135093975Z" level=info msg="CreateContainer within sandbox \"b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 13:45:56.155278 containerd[1513]: time="2025-01-15T13:45:56.155211459Z" level=info msg="CreateContainer within sandbox \"b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"be7c9ce1fd9242fa51de60a1f3ea828bc11e2a06bab8a4f413337500f9d2d668\"" Jan 15 13:45:56.157567 containerd[1513]: time="2025-01-15T13:45:56.156313504Z" level=info msg="StartContainer for \"be7c9ce1fd9242fa51de60a1f3ea828bc11e2a06bab8a4f413337500f9d2d668\"" Jan 15 13:45:56.195790 systemd[1]: Started cri-containerd-be7c9ce1fd9242fa51de60a1f3ea828bc11e2a06bab8a4f413337500f9d2d668.scope - libcontainer container be7c9ce1fd9242fa51de60a1f3ea828bc11e2a06bab8a4f413337500f9d2d668. 
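The RunPodSandbox, CreateContainer, and StartContainer messages above are the CRI calls kubelet issues over containerd's gRPC socket; the earlier sandbox errors were failures of the first call in this chain. A minimal sketch of the same sequence against the CRI runtime service, with the pod and container configs elided; everything here is illustrative except the socket path and the call order:

    package main

    import (
        "context"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        rt := runtimeapi.NewRuntimeServiceClient(conn)
        ctx := context.Background()

        sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{
            Config: &runtimeapi.PodSandboxConfig{ /* pod metadata elided */ },
        })
        if err != nil {
            log.Fatal(err) // the call that failed while nodename was missing
        }
        ctr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
            PodSandboxId: sb.PodSandboxId,
            Config:       &runtimeapi.ContainerConfig{ /* image, command elided */ },
        })
        if err != nil {
            log.Fatal(err)
        }
        if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{
            ContainerId: ctr.ContainerId,
        }); err != nil {
            log.Fatal(err)
        }
    }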
Jan 15 13:45:56.259877 containerd[1513]: time="2025-01-15T13:45:56.259825236Z" level=info msg="StartContainer for \"be7c9ce1fd9242fa51de60a1f3ea828bc11e2a06bab8a4f413337500f9d2d668\" returns successfully" Jan 15 13:45:57.239122 kubelet[2711]: I0115 13:45:57.237847 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-qstwj" podStartSLOduration=38.23781037 podStartE2EDuration="38.23781037s" podCreationTimestamp="2025-01-15 13:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:45:57.236998025 +0000 UTC m=+51.489676116" watchObservedRunningTime="2025-01-15 13:45:57.23781037 +0000 UTC m=+51.490488457" Jan 15 13:45:57.307174 systemd-networkd[1436]: cali93d08021791: Gained IPv6LL Jan 15 13:45:57.818913 systemd-networkd[1436]: cali48ccfec3b47: Gained IPv6LL Jan 15 13:45:57.891030 containerd[1513]: time="2025-01-15T13:45:57.890263313Z" level=info msg="StopPodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\"" Jan 15 13:45:57.892862 containerd[1513]: time="2025-01-15T13:45:57.892265146Z" level=info msg="StopPodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\"" Jan 15 13:45:57.917868 kubelet[2711]: I0115 13:45:57.917556 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.009 [INFO][4309] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.010 [INFO][4309] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" iface="eth0" netns="/var/run/netns/cni-a2113786-7044-830d-1f95-a9efd6a00b09" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.011 [INFO][4309] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" iface="eth0" netns="/var/run/netns/cni-a2113786-7044-830d-1f95-a9efd6a00b09" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.011 [INFO][4309] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" iface="eth0" netns="/var/run/netns/cni-a2113786-7044-830d-1f95-a9efd6a00b09" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.011 [INFO][4309] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.011 [INFO][4309] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.115 [INFO][4324] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.115 [INFO][4324] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
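In the coredns startup-duration entry above, firstStartedPulling and lastFinishedPulling are "0001-01-01 00:00:00 +0000 UTC": the image was already on disk, no pull happened, and kubelet logs Go's zero time.Time, which is also why podStartSLOduration and podStartE2EDuration are identical (38.23781037s). For reference:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        var never time.Time        // what kubelet logs when no pull happened
        fmt.Println(never)          // 0001-01-01 00:00:00 +0000 UTC
        fmt.Println(never.IsZero()) // true
    }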
Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.115 [INFO][4324] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.136 [WARNING][4324] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.136 [INFO][4324] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.138 [INFO][4324] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:58.161275 containerd[1513]: 2025-01-15 13:45:58.143 [INFO][4309] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:45:58.161275 containerd[1513]: time="2025-01-15T13:45:58.152603349Z" level=info msg="TearDown network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" successfully" Jan 15 13:45:58.161275 containerd[1513]: time="2025-01-15T13:45:58.152691643Z" level=info msg="StopPodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" returns successfully" Jan 15 13:45:58.161275 containerd[1513]: time="2025-01-15T13:45:58.156794681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8496b9bc76-lh2fq,Uid:9c55259c-6a2f-4fd1-8729-52141d279855,Namespace:calico-system,Attempt:1,}" Jan 15 13:45:58.164911 systemd[1]: run-netns-cni\x2da2113786\x2d7044\x2d830d\x2d1f95\x2da9efd6a00b09.mount: Deactivated successfully. Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.022 [INFO][4316] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.022 [INFO][4316] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" iface="eth0" netns="/var/run/netns/cni-bead732b-f9c5-d6a5-ca9d-9e12beb3032d" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.023 [INFO][4316] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" iface="eth0" netns="/var/run/netns/cni-bead732b-f9c5-d6a5-ca9d-9e12beb3032d" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.023 [INFO][4316] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" iface="eth0" netns="/var/run/netns/cni-bead732b-f9c5-d6a5-ca9d-9e12beb3032d" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.023 [INFO][4316] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.023 [INFO][4316] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.134 [INFO][4328] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.136 [INFO][4328] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.138 [INFO][4328] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.167 [WARNING][4328] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.168 [INFO][4328] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.171 [INFO][4328] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:58.177693 containerd[1513]: 2025-01-15 13:45:58.174 [INFO][4316] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:45:58.183202 containerd[1513]: time="2025-01-15T13:45:58.181306921Z" level=info msg="TearDown network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" successfully" Jan 15 13:45:58.183202 containerd[1513]: time="2025-01-15T13:45:58.181383620Z" level=info msg="StopPodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" returns successfully" Jan 15 13:45:58.182920 systemd[1]: run-netns-cni\x2dbead732b\x2df9c5\x2dd6a5\x2dca9d\x2d9e12beb3032d.mount: Deactivated successfully. 
Jan 15 13:45:58.186822 containerd[1513]: time="2025-01-15T13:45:58.186773987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-ls2h6,Uid:403d6493-86a8-45cb-bcf7-b66df6eeb925,Namespace:calico-apiserver,Attempt:1,}" Jan 15 13:45:58.506769 systemd-networkd[1436]: cali874334f92a8: Link UP Jan 15 13:45:58.507088 systemd-networkd[1436]: cali874334f92a8: Gained carrier Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.248 [INFO][4339] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.282 [INFO][4339] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0 calico-kube-controllers-8496b9bc76- calico-system 9c55259c-6a2f-4fd1-8729-52141d279855 818 0 2025-01-15 13:45:29 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8496b9bc76 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com calico-kube-controllers-8496b9bc76-lh2fq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali874334f92a8 [] []}} ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.282 [INFO][4339] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.402 [INFO][4368] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.434 [INFO][4368] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bd810), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"calico-kube-controllers-8496b9bc76-lh2fq", "timestamp":"2025-01-15 13:45:58.402174504 +0000 UTC"}, Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.434 [INFO][4368] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.434 [INFO][4368] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.434 [INFO][4368] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.440 [INFO][4368] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.451 [INFO][4368] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.461 [INFO][4368] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.466 [INFO][4368] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.470 [INFO][4368] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.470 [INFO][4368] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.473 [INFO][4368] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6 Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.481 [INFO][4368] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 handle="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.495 [INFO][4368] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.195/26] block=192.168.95.192/26 handle="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.495 [INFO][4368] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.195/26] handle="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.495 [INFO][4368] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
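Taken together, the three IPAM traces hand out consecutive addresses from the same affine block: .193 to csi-node-driver-zz4wr, .194 to coredns-7db6d8ff4d-qstwj, and now .195 to calico-kube-controllers-8496b9bc76-lh2fq, with the host-wide lock keeping the scans from racing. A loose, illustrative allocator mimicking that first-free scan; real Calico IPAM persists claims in its datastore, and treating .192 as already taken is an inference from the trace, since the first claim was .193:

    package main

    import (
        "fmt"
        "net"
    )

    // nextFree loosely mimics the "attempt to assign 1 address from block" step
    // in the IPAM traces: scan the /26 in order, return the first unclaimed IP.
    func nextFree(block *net.IPNet, used map[string]bool) net.IP {
        ip := block.IP.Mask(block.Mask)
        for ; block.Contains(ip); ip = next(ip) {
            if !used[ip.String()] {
                used[ip.String()] = true
                return append(net.IP(nil), ip...)
            }
        }
        return nil
    }

    // next returns ip + 1, carrying across bytes.
    func next(ip net.IP) net.IP {
        out := append(net.IP(nil), ip...)
        for i := len(out) - 1; i >= 0; i-- {
            out[i]++
            if out[i] != 0 {
                break
            }
        }
        return out
    }

    func main() {
        _, block, _ := net.ParseCIDR("192.168.95.192/26")
        used := map[string]bool{"192.168.95.192": true} // inferred as taken; see lead-in
        for _, pod := range []string{
            "csi-node-driver-zz4wr",
            "coredns-7db6d8ff4d-qstwj",
            "calico-kube-controllers-8496b9bc76-lh2fq",
        } {
            fmt.Printf("%s -> %s\n", pod, nextFree(block, used))
        }
        // .193, .194, .195: the same order the three IPAM traces above record.
    }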
Jan 15 13:45:58.550470 containerd[1513]: 2025-01-15 13:45:58.495 [INFO][4368] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.195/26] IPv6=[] ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.554547 containerd[1513]: 2025-01-15 13:45:58.501 [INFO][4339] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0", GenerateName:"calico-kube-controllers-8496b9bc76-", Namespace:"calico-system", SelfLink:"", UID:"9c55259c-6a2f-4fd1-8729-52141d279855", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8496b9bc76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-8496b9bc76-lh2fq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali874334f92a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:58.554547 containerd[1513]: 2025-01-15 13:45:58.501 [INFO][4339] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.195/32] ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.554547 containerd[1513]: 2025-01-15 13:45:58.501 [INFO][4339] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali874334f92a8 ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.554547 containerd[1513]: 2025-01-15 13:45:58.505 [INFO][4339] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 
13:45:58.554547 containerd[1513]: 2025-01-15 13:45:58.508 [INFO][4339] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0", GenerateName:"calico-kube-controllers-8496b9bc76-", Namespace:"calico-system", SelfLink:"", UID:"9c55259c-6a2f-4fd1-8729-52141d279855", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8496b9bc76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6", Pod:"calico-kube-controllers-8496b9bc76-lh2fq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali874334f92a8", MAC:"be:5e:12:fc:46:61", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:58.554547 containerd[1513]: 2025-01-15 13:45:58.535 [INFO][4339] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Namespace="calico-system" Pod="calico-kube-controllers-8496b9bc76-lh2fq" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:45:58.610222 systemd-networkd[1436]: calidb919b18e49: Link UP Jan 15 13:45:58.611762 systemd-networkd[1436]: calidb919b18e49: Gained carrier Jan 15 13:45:58.624549 containerd[1513]: time="2025-01-15T13:45:58.622826874Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:45:58.624549 containerd[1513]: time="2025-01-15T13:45:58.622909593Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:45:58.624549 containerd[1513]: time="2025-01-15T13:45:58.622927882Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:58.624549 containerd[1513]: time="2025-01-15T13:45:58.623052643Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.285 [INFO][4349] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.319 [INFO][4349] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0 calico-apiserver-54c9c669d7- calico-apiserver 403d6493-86a8-45cb-bcf7-b66df6eeb925 822 0 2025-01-15 13:45:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54c9c669d7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com calico-apiserver-54c9c669d7-ls2h6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidb919b18e49 [] []}} ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.320 [INFO][4349] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.410 [INFO][4375] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" HandleID="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.444 [INFO][4375] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" HandleID="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bcba0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"calico-apiserver-54c9c669d7-ls2h6", "timestamp":"2025-01-15 13:45:58.410883344 +0000 UTC"}, Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.445 [INFO][4375] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.496 [INFO][4375] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
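[Annotation] Note the serialization the lock enforces: [4375] logged "About to acquire" at 13:45:58.445 but "Acquired" only at 13:45:58.496, roughly one millisecond after [4368] logged "Released" at 13:45:58.495, so the two concurrent CNI ADDs ran their IPAM phases strictly one after the other. The wait can be read straight off the timestamps; a throwaway check using Go's time package:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the two [4375] ipam_plugin.go entries above.
	const layout = "2006-01-02 15:04:05.000"
	want, _ := time.Parse(layout, "2025-01-15 13:45:58.445")
	got, _ := time.Parse(layout, "2025-01-15 13:45:58.496")
	fmt.Println(got.Sub(want)) // 51ms spent waiting on the host-wide lock
}
```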
Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.498 [INFO][4375] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.507 [INFO][4375] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.520 [INFO][4375] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.547 [INFO][4375] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.558 [INFO][4375] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.567 [INFO][4375] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.569 [INFO][4375] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.573 [INFO][4375] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1 Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.584 [INFO][4375] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 handle="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.598 [INFO][4375] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.196/26] block=192.168.95.192/26 handle="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.599 [INFO][4375] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.196/26] handle="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.599 [INFO][4375] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
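[Annotation] The Workload= and WorkloadEndpoint object names in these entries follow a visible escaping convention: literal '-' inside each field is doubled, so a single '-' can serve as the separator between node, orchestrator, pod, and interface. That is why node "srv-6yg2e.gb1.brightbox.com" appears as "srv--6yg2e.gb1.brightbox.com" and pod "calico-apiserver-54c9c669d7-ls2h6" as "calico--apiserver--54c9c669d7--ls2h6". A sketch of the rule as inferred from this log (not taken from Calico's source):

```go
package main

import (
	"fmt"
	"strings"
)

// wepName reassembles the WorkloadEndpoint object name seen in these logs:
// literal '-' in each field is doubled so a single '-' can act as the
// field separator. Inferred from the journal, not from Calico's source.
func wepName(node, orchestrator, pod, endpoint string) string {
	esc := func(s string) string { return strings.ReplaceAll(s, "-", "--") }
	return strings.Join([]string{esc(node), esc(orchestrator), esc(pod), esc(endpoint)}, "-")
}

func main() {
	fmt.Println(wepName("srv-6yg2e.gb1.brightbox.com", "k8s",
		"calico-apiserver-54c9c669d7-ls2h6", "eth0"))
	// srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0
}
```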
Jan 15 13:45:58.642634 containerd[1513]: 2025-01-15 13:45:58.599 [INFO][4375] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.196/26] IPv6=[] ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" HandleID="k8s-pod-network.61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.644021 containerd[1513]: 2025-01-15 13:45:58.603 [INFO][4349] cni-plugin/k8s.go 386: Populated endpoint ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"403d6493-86a8-45cb-bcf7-b66df6eeb925", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-54c9c669d7-ls2h6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb919b18e49", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:58.644021 containerd[1513]: 2025-01-15 13:45:58.603 [INFO][4349] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.196/32] ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.644021 containerd[1513]: 2025-01-15 13:45:58.603 [INFO][4349] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb919b18e49 ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.644021 containerd[1513]: 2025-01-15 13:45:58.609 [INFO][4349] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.644021 containerd[1513]: 2025-01-15 13:45:58.609 [INFO][4349] cni-plugin/k8s.go 
414: Added Mac, interface name, and active container ID to endpoint ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"403d6493-86a8-45cb-bcf7-b66df6eeb925", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1", Pod:"calico-apiserver-54c9c669d7-ls2h6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb919b18e49", MAC:"fe:c6:5a:45:98:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:58.644021 containerd[1513]: 2025-01-15 13:45:58.635 [INFO][4349] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-ls2h6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:45:58.666702 systemd[1]: Started cri-containerd-cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6.scope - libcontainer container cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6. Jan 15 13:45:58.740060 containerd[1513]: time="2025-01-15T13:45:58.736152798Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:45:58.740060 containerd[1513]: time="2025-01-15T13:45:58.736303437Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:45:58.740060 containerd[1513]: time="2025-01-15T13:45:58.736325399Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:58.740060 containerd[1513]: time="2025-01-15T13:45:58.736651559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:58.800375 containerd[1513]: time="2025-01-15T13:45:58.800043011Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:58.812384 containerd[1513]: time="2025-01-15T13:45:58.812300700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 15 13:45:58.813065 systemd[1]: Started cri-containerd-61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1.scope - libcontainer container 61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1. Jan 15 13:45:58.819284 containerd[1513]: time="2025-01-15T13:45:58.819224844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8496b9bc76-lh2fq,Uid:9c55259c-6a2f-4fd1-8729-52141d279855,Namespace:calico-system,Attempt:1,} returns sandbox id \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\"" Jan 15 13:45:58.827654 containerd[1513]: time="2025-01-15T13:45:58.827607172Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:58.837887 containerd[1513]: time="2025-01-15T13:45:58.837764907Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:45:58.839535 containerd[1513]: time="2025-01-15T13:45:58.838751553Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 2.724048125s" Jan 15 13:45:58.839535 containerd[1513]: time="2025-01-15T13:45:58.838883796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 15 13:45:58.846583 containerd[1513]: time="2025-01-15T13:45:58.846540240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 15 13:45:58.853171 containerd[1513]: time="2025-01-15T13:45:58.853121051Z" level=info msg="CreateContainer within sandbox \"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 15 13:45:58.884263 containerd[1513]: time="2025-01-15T13:45:58.883850711Z" level=info msg="CreateContainer within sandbox \"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"c9ec79a0a87b637f6139e9322e518d376248d8978840bb6c157fe72d533249c1\"" Jan 15 13:45:58.885922 containerd[1513]: time="2025-01-15T13:45:58.885699814Z" level=info msg="StartContainer for \"c9ec79a0a87b637f6139e9322e518d376248d8978840bb6c157fe72d533249c1\"" Jan 15 13:45:58.891948 containerd[1513]: time="2025-01-15T13:45:58.891906185Z" level=info msg="StopPodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\"" Jan 15 13:45:59.009097 systemd[1]: Started cri-containerd-c9ec79a0a87b637f6139e9322e518d376248d8978840bb6c157fe72d533249c1.scope - libcontainer container 
c9ec79a0a87b637f6139e9322e518d376248d8978840bb6c157fe72d533249c1. Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.007 [INFO][4509] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.007 [INFO][4509] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" iface="eth0" netns="/var/run/netns/cni-8f3d02df-e43c-5db2-0df4-2ab372f52b6d" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.008 [INFO][4509] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" iface="eth0" netns="/var/run/netns/cni-8f3d02df-e43c-5db2-0df4-2ab372f52b6d" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.012 [INFO][4509] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" iface="eth0" netns="/var/run/netns/cni-8f3d02df-e43c-5db2-0df4-2ab372f52b6d" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.012 [INFO][4509] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.012 [INFO][4509] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.064 [INFO][4538] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.064 [INFO][4538] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.065 [INFO][4538] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.077 [WARNING][4538] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.077 [INFO][4538] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.079 [INFO][4538] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:59.111671 containerd[1513]: 2025-01-15 13:45:59.085 [INFO][4509] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:45:59.117520 containerd[1513]: time="2025-01-15T13:45:59.116375940Z" level=info msg="TearDown network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" successfully" Jan 15 13:45:59.117801 containerd[1513]: time="2025-01-15T13:45:59.117772782Z" level=info msg="StopPodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" returns successfully" Jan 15 13:45:59.121633 containerd[1513]: time="2025-01-15T13:45:59.121233133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2w45s,Uid:299bee91-9829-4be7-9e66-0041031d6397,Namespace:kube-system,Attempt:1,}" Jan 15 13:45:59.129068 containerd[1513]: time="2025-01-15T13:45:59.129012814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-ls2h6,Uid:403d6493-86a8-45cb-bcf7-b66df6eeb925,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1\"" Jan 15 13:45:59.175893 systemd[1]: run-netns-cni\x2d8f3d02df\x2de43c\x2d5db2\x2d0df4\x2d2ab372f52b6d.mount: Deactivated successfully. Jan 15 13:45:59.284018 containerd[1513]: time="2025-01-15T13:45:59.283955099Z" level=info msg="StartContainer for \"c9ec79a0a87b637f6139e9322e518d376248d8978840bb6c157fe72d533249c1\" returns successfully" Jan 15 13:45:59.605229 systemd-networkd[1436]: calib97b0367ad0: Link UP Jan 15 13:45:59.609412 systemd-networkd[1436]: calib97b0367ad0: Gained carrier Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.270 [INFO][4562] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.333 [INFO][4562] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0 coredns-7db6d8ff4d- kube-system 299bee91-9829-4be7-9e66-0041031d6397 835 0 2025-01-15 13:45:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com coredns-7db6d8ff4d-2w45s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib97b0367ad0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.333 [INFO][4562] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.438 [INFO][4601] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" HandleID="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.496 [INFO][4601] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" HandleID="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000360dc0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"coredns-7db6d8ff4d-2w45s", "timestamp":"2025-01-15 13:45:59.437974532 +0000 UTC"}, Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.496 [INFO][4601] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.497 [INFO][4601] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.497 [INFO][4601] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.512 [INFO][4601] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.521 [INFO][4601] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.538 [INFO][4601] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.540 [INFO][4601] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.553 [INFO][4601] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.553 [INFO][4601] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.556 [INFO][4601] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070 Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.574 [INFO][4601] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 handle="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.594 [INFO][4601] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.197/26] block=192.168.95.192/26 handle="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.595 [INFO][4601] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.197/26] handle="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:45:59.650055 containerd[1513]: 
2025-01-15 13:45:59.595 [INFO][4601] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:45:59.650055 containerd[1513]: 2025-01-15 13:45:59.595 [INFO][4601] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.197/26] IPv6=[] ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" HandleID="k8s-pod-network.74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.653329 containerd[1513]: 2025-01-15 13:45:59.600 [INFO][4562] cni-plugin/k8s.go 386: Populated endpoint ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"299bee91-9829-4be7-9e66-0041031d6397", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7db6d8ff4d-2w45s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib97b0367ad0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:59.653329 containerd[1513]: 2025-01-15 13:45:59.600 [INFO][4562] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.197/32] ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.653329 containerd[1513]: 2025-01-15 13:45:59.600 [INFO][4562] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib97b0367ad0 ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.653329 containerd[1513]: 2025-01-15 13:45:59.608 [INFO][4562] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.653329 containerd[1513]: 2025-01-15 13:45:59.612 [INFO][4562] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"299bee91-9829-4be7-9e66-0041031d6397", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070", Pod:"coredns-7db6d8ff4d-2w45s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib97b0367ad0", MAC:"52:06:5d:c2:32:98", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:45:59.653329 containerd[1513]: 2025-01-15 13:45:59.646 [INFO][4562] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2w45s" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:45:59.675636 systemd-networkd[1436]: cali874334f92a8: Gained IPv6LL Jan 15 13:45:59.700330 containerd[1513]: time="2025-01-15T13:45:59.700149421Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:45:59.700330 containerd[1513]: time="2025-01-15T13:45:59.700273296Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:45:59.700692 containerd[1513]: time="2025-01-15T13:45:59.700298821Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:59.703523 containerd[1513]: time="2025-01-15T13:45:59.703202322Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:45:59.757819 systemd[1]: Started cri-containerd-74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070.scope - libcontainer container 74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070. Jan 15 13:45:59.856252 containerd[1513]: time="2025-01-15T13:45:59.856093718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2w45s,Uid:299bee91-9829-4be7-9e66-0041031d6397,Namespace:kube-system,Attempt:1,} returns sandbox id \"74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070\"" Jan 15 13:45:59.888417 containerd[1513]: time="2025-01-15T13:45:59.886569709Z" level=info msg="CreateContainer within sandbox \"74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 13:45:59.919610 containerd[1513]: time="2025-01-15T13:45:59.919532140Z" level=info msg="CreateContainer within sandbox \"74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5248782d8be78606b863ae72bea219d3c043262355b3e8cab9b87c69a9cb7c05\"" Jan 15 13:45:59.923598 containerd[1513]: time="2025-01-15T13:45:59.920575841Z" level=info msg="StartContainer for \"5248782d8be78606b863ae72bea219d3c043262355b3e8cab9b87c69a9cb7c05\"" Jan 15 13:45:59.989775 systemd[1]: Started cri-containerd-5248782d8be78606b863ae72bea219d3c043262355b3e8cab9b87c69a9cb7c05.scope - libcontainer container 5248782d8be78606b863ae72bea219d3c043262355b3e8cab9b87c69a9cb7c05. Jan 15 13:46:00.055979 containerd[1513]: time="2025-01-15T13:46:00.055605584Z" level=info msg="StartContainer for \"5248782d8be78606b863ae72bea219d3c043262355b3e8cab9b87c69a9cb7c05\" returns successfully" Jan 15 13:46:00.141488 kernel: bpftool[4724]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 15 13:46:00.251759 systemd-networkd[1436]: calidb919b18e49: Gained IPv6LL Jan 15 13:46:00.297231 kubelet[2711]: I0115 13:46:00.297074 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-2w45s" podStartSLOduration=41.296947196 podStartE2EDuration="41.296947196s" podCreationTimestamp="2025-01-15 13:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:46:00.294055335 +0000 UTC m=+54.546733420" watchObservedRunningTime="2025-01-15 13:46:00.296947196 +0000 UTC m=+54.549625281" Jan 15 13:46:00.803632 systemd-networkd[1436]: vxlan.calico: Link UP Jan 15 13:46:00.803647 systemd-networkd[1436]: vxlan.calico: Gained carrier Jan 15 13:46:00.895257 containerd[1513]: time="2025-01-15T13:46:00.895177207Z" level=info msg="StopPodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\"" Jan 15 13:46:01.019037 systemd-networkd[1436]: calib97b0367ad0: Gained IPv6LL Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.103 [INFO][4799] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.105 [INFO][4799] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" iface="eth0" netns="/var/run/netns/cni-964df3f4-8f0b-8cd3-3329-dad22bb82fa9" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.105 [INFO][4799] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" iface="eth0" netns="/var/run/netns/cni-964df3f4-8f0b-8cd3-3329-dad22bb82fa9" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.105 [INFO][4799] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" iface="eth0" netns="/var/run/netns/cni-964df3f4-8f0b-8cd3-3329-dad22bb82fa9" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.105 [INFO][4799] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.106 [INFO][4799] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.183 [INFO][4805] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.183 [INFO][4805] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.184 [INFO][4805] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.206 [WARNING][4805] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.207 [INFO][4805] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.209 [INFO][4805] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:01.219400 containerd[1513]: 2025-01-15 13:46:01.214 [INFO][4799] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:01.225466 containerd[1513]: time="2025-01-15T13:46:01.224096626Z" level=info msg="TearDown network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" successfully" Jan 15 13:46:01.225466 containerd[1513]: time="2025-01-15T13:46:01.224198608Z" level=info msg="StopPodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" returns successfully" Jan 15 13:46:01.225669 containerd[1513]: time="2025-01-15T13:46:01.225550863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-zgh9j,Uid:9de999de-69d1-4aa2-96d0-0c16e7b716ad,Namespace:calico-apiserver,Attempt:1,}" Jan 15 13:46:01.235191 systemd[1]: run-netns-cni\x2d964df3f4\x2d8f0b\x2d8cd3\x2d3329\x2ddad22bb82fa9.mount: Deactivated successfully. Jan 15 13:46:01.601027 systemd-networkd[1436]: cali0012ba55a4f: Link UP Jan 15 13:46:01.606147 systemd-networkd[1436]: cali0012ba55a4f: Gained carrier Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.363 [INFO][4813] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0 calico-apiserver-54c9c669d7- calico-apiserver 9de999de-69d1-4aa2-96d0-0c16e7b716ad 858 0 2025-01-15 13:45:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54c9c669d7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com calico-apiserver-54c9c669d7-zgh9j eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0012ba55a4f [] []}} ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.363 [INFO][4813] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.479 [INFO][4824] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" HandleID="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.507 [INFO][4824] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" HandleID="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291960), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"calico-apiserver-54c9c669d7-zgh9j", "timestamp":"2025-01-15 13:46:01.479425332 +0000 UTC"}, 
Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.507 [INFO][4824] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.508 [INFO][4824] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.508 [INFO][4824] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.515 [INFO][4824] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.531 [INFO][4824] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.549 [INFO][4824] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.553 [INFO][4824] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.557 [INFO][4824] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.557 [INFO][4824] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.561 [INFO][4824] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.572 [INFO][4824] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 handle="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.588 [INFO][4824] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.198/26] block=192.168.95.192/26 handle="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.588 [INFO][4824] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.198/26] handle="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.588 [INFO][4824] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 15 13:46:01.651330 containerd[1513]: 2025-01-15 13:46:01.588 [INFO][4824] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.198/26] IPv6=[] ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" HandleID="k8s-pod-network.e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.654278 containerd[1513]: 2025-01-15 13:46:01.593 [INFO][4813] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9de999de-69d1-4aa2-96d0-0c16e7b716ad", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-54c9c669d7-zgh9j", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0012ba55a4f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:01.654278 containerd[1513]: 2025-01-15 13:46:01.593 [INFO][4813] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.198/32] ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.654278 containerd[1513]: 2025-01-15 13:46:01.593 [INFO][4813] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0012ba55a4f ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.654278 containerd[1513]: 2025-01-15 13:46:01.608 [INFO][4813] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.654278 containerd[1513]: 2025-01-15 13:46:01.615 [INFO][4813] cni-plugin/k8s.go 
414: Added Mac, interface name, and active container ID to endpoint ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9de999de-69d1-4aa2-96d0-0c16e7b716ad", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a", Pod:"calico-apiserver-54c9c669d7-zgh9j", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0012ba55a4f", MAC:"2e:b7:08:bf:65:ca", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:01.654278 containerd[1513]: 2025-01-15 13:46:01.646 [INFO][4813] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a" Namespace="calico-apiserver" Pod="calico-apiserver-54c9c669d7-zgh9j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:01.762264 containerd[1513]: time="2025-01-15T13:46:01.762110708Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:46:01.763932 containerd[1513]: time="2025-01-15T13:46:01.762237754Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:46:01.763932 containerd[1513]: time="2025-01-15T13:46:01.763806142Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:01.764763 containerd[1513]: time="2025-01-15T13:46:01.764648600Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:01.827667 systemd[1]: Started cri-containerd-e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a.scope - libcontainer container e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a. 
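[Annotation] Every MAC stamped onto an endpoint in this log (be:5e:12:fc:46:61, fe:c6:5a:45:98:93, 52:06:5d:c2:32:98, and 2e:b7:08:bf:65:ca here) has the locally-administered bit set and the multicast bit clear, which is the standard recipe for synthesizing an interface MAC that cannot collide with vendor-assigned unicast space. A sketch of that recipe, consistent with the values above but not claimed to be Calico's code:

```go
package main

import (
	"crypto/rand"
	"fmt"
	"net"
)

// randomMAC returns a random locally-administered, unicast MAC, the shape
// of every MAC= value in the endpoints above (e.g. 2e:b7:08:bf:65:ca).
func randomMAC() (net.HardwareAddr, error) {
	buf := make([]byte, 6)
	if _, err := rand.Read(buf); err != nil {
		return nil, err
	}
	buf[0] = (buf[0] | 0x02) &^ 0x01 // set local-admin bit, clear multicast bit
	return net.HardwareAddr(buf), nil
}

func main() {
	mac, err := randomMAC()
	if err != nil {
		panic(err)
	}
	fmt.Println(mac)
}
```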
Jan 15 13:46:02.112801 containerd[1513]: time="2025-01-15T13:46:02.110806871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54c9c669d7-zgh9j,Uid:9de999de-69d1-4aa2-96d0-0c16e7b716ad,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a\"" Jan 15 13:46:02.618687 systemd-networkd[1436]: cali0012ba55a4f: Gained IPv6LL Jan 15 13:46:02.737172 containerd[1513]: time="2025-01-15T13:46:02.736203376Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:02.738627 containerd[1513]: time="2025-01-15T13:46:02.738550482Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 15 13:46:02.739699 containerd[1513]: time="2025-01-15T13:46:02.739642302Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:02.743029 containerd[1513]: time="2025-01-15T13:46:02.742903068Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:02.744383 containerd[1513]: time="2025-01-15T13:46:02.744214253Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 3.897625862s" Jan 15 13:46:02.744383 containerd[1513]: time="2025-01-15T13:46:02.744259084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 15 13:46:02.746724 systemd-networkd[1436]: vxlan.calico: Gained IPv6LL Jan 15 13:46:02.751016 containerd[1513]: time="2025-01-15T13:46:02.750967897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 15 13:46:02.779121 containerd[1513]: time="2025-01-15T13:46:02.778922396Z" level=info msg="CreateContainer within sandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 15 13:46:02.812256 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3838153467.mount: Deactivated successfully. Jan 15 13:46:02.823484 containerd[1513]: time="2025-01-15T13:46:02.823164018Z" level=info msg="CreateContainer within sandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\"" Jan 15 13:46:02.825858 containerd[1513]: time="2025-01-15T13:46:02.825748350Z" level=info msg="StartContainer for \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\"" Jan 15 13:46:02.895749 systemd[1]: Started cri-containerd-b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3.scope - libcontainer container b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3. 
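[Annotation] The unit names in the systemd entries here, "var-lib-containerd-tmpmounts-containerd\x2dmount3838153467.mount" above and "run-netns-cni\x2d8f3d02df\x2de43c\x2d5db2\x2d0df4\x2d2ab372f52b6d.mount" earlier, show systemd's path-to-unit-name mapping: the leading '/' is dropped, literal '-' is escaped as \x2d, and the remaining '/' separators become '-'. A sketch of just that convention (an approximation; the real systemd-escape handles more cases):

```go
package main

import (
	"fmt"
	"strings"
)

// mountUnit approximates how systemd names a mount unit for a path:
// strip the leading '/', escape literal '-' as \x2d, then turn the
// remaining '/' separators into '-'. Sketch of the convention seen in
// these "Deactivated successfully" entries, not a full systemd-escape.
func mountUnit(path string) string {
	p := strings.TrimPrefix(path, "/")
	p = strings.ReplaceAll(p, "-", `\x2d`)
	p = strings.ReplaceAll(p, "/", "-")
	return p + ".mount"
}

func main() {
	fmt.Println(mountUnit("/run/netns/cni-8f3d02df-e43c-5db2-0df4-2ab372f52b6d"))
	// run-netns-cni\x2d8f3d02df\x2de43c\x2d5db2\x2d0df4\x2d2ab372f52b6d.mount
}
```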
Jan 15 13:46:02.989195 containerd[1513]: time="2025-01-15T13:46:02.989012831Z" level=info msg="StartContainer for \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\" returns successfully" Jan 15 13:46:03.310705 kubelet[2711]: I0115 13:46:03.310602 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8496b9bc76-lh2fq" podStartSLOduration=30.393391122 podStartE2EDuration="34.310556754s" podCreationTimestamp="2025-01-15 13:45:29 +0000 UTC" firstStartedPulling="2025-01-15 13:45:58.832361841 +0000 UTC m=+53.085039926" lastFinishedPulling="2025-01-15 13:46:02.749527486 +0000 UTC m=+57.002205558" observedRunningTime="2025-01-15 13:46:03.310375259 +0000 UTC m=+57.563053353" watchObservedRunningTime="2025-01-15 13:46:03.310556754 +0000 UTC m=+57.563234844" Jan 15 13:46:05.931141 containerd[1513]: time="2025-01-15T13:46:05.931026421Z" level=info msg="StopPodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\"" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.119 [WARNING][5002] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9dd88286-18b7-4e6e-a5d3-8c847dab96ba", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c", Pod:"csi-node-driver-zz4wr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali48ccfec3b47", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.121 [INFO][5002] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.121 [INFO][5002] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" iface="eth0" netns="" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.121 [INFO][5002] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.121 [INFO][5002] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.176 [INFO][5008] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.177 [INFO][5008] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.177 [INFO][5008] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.192 [WARNING][5008] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.192 [INFO][5008] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.200 [INFO][5008] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:06.208681 containerd[1513]: 2025-01-15 13:46:06.204 [INFO][5002] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.208681 containerd[1513]: time="2025-01-15T13:46:06.207938669Z" level=info msg="TearDown network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" successfully" Jan 15 13:46:06.208681 containerd[1513]: time="2025-01-15T13:46:06.207985828Z" level=info msg="StopPodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" returns successfully" Jan 15 13:46:06.211519 containerd[1513]: time="2025-01-15T13:46:06.209433671Z" level=info msg="RemovePodSandbox for \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\"" Jan 15 13:46:06.211519 containerd[1513]: time="2025-01-15T13:46:06.209507143Z" level=info msg="Forcibly stopping sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\"" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.306 [WARNING][5027] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9dd88286-18b7-4e6e-a5d3-8c847dab96ba", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c", Pod:"csi-node-driver-zz4wr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali48ccfec3b47", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.308 [INFO][5027] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.308 [INFO][5027] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" iface="eth0" netns="" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.308 [INFO][5027] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.308 [INFO][5027] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.399 [INFO][5033] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.400 [INFO][5033] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.400 [INFO][5033] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.413 [WARNING][5033] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.413 [INFO][5033] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" HandleID="k8s-pod-network.480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Workload="srv--6yg2e.gb1.brightbox.com-k8s-csi--node--driver--zz4wr-eth0" Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.416 [INFO][5033] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:06.421715 containerd[1513]: 2025-01-15 13:46:06.418 [INFO][5027] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606" Jan 15 13:46:06.421715 containerd[1513]: time="2025-01-15T13:46:06.421409120Z" level=info msg="TearDown network for sandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" successfully" Jan 15 13:46:06.437799 containerd[1513]: time="2025-01-15T13:46:06.437481472Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:46:06.437799 containerd[1513]: time="2025-01-15T13:46:06.437652199Z" level=info msg="RemovePodSandbox \"480c93dcf9915d44209898ac1334753639ffd578e39f0b2371c625898b52c606\" returns successfully" Jan 15 13:46:06.440535 containerd[1513]: time="2025-01-15T13:46:06.440499280Z" level=info msg="StopPodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\"" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.525 [WARNING][5052] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9de999de-69d1-4aa2-96d0-0c16e7b716ad", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a", Pod:"calico-apiserver-54c9c669d7-zgh9j", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0012ba55a4f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.525 [INFO][5052] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.525 [INFO][5052] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" iface="eth0" netns="" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.525 [INFO][5052] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.525 [INFO][5052] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.580 [INFO][5058] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.580 [INFO][5058] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.580 [INFO][5058] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.597 [WARNING][5058] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.597 [INFO][5058] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.601 [INFO][5058] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:06.607792 containerd[1513]: 2025-01-15 13:46:06.603 [INFO][5052] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.608993 containerd[1513]: time="2025-01-15T13:46:06.607849100Z" level=info msg="TearDown network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" successfully" Jan 15 13:46:06.608993 containerd[1513]: time="2025-01-15T13:46:06.607884188Z" level=info msg="StopPodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" returns successfully" Jan 15 13:46:06.610470 containerd[1513]: time="2025-01-15T13:46:06.609951135Z" level=info msg="RemovePodSandbox for \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\"" Jan 15 13:46:06.610470 containerd[1513]: time="2025-01-15T13:46:06.610001374Z" level=info msg="Forcibly stopping sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\"" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.691 [WARNING][5076] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"9de999de-69d1-4aa2-96d0-0c16e7b716ad", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a", Pod:"calico-apiserver-54c9c669d7-zgh9j", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0012ba55a4f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.691 [INFO][5076] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.691 [INFO][5076] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" iface="eth0" netns="" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.692 [INFO][5076] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.692 [INFO][5076] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.735 [INFO][5082] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.735 [INFO][5082] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.735 [INFO][5082] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.747 [WARNING][5082] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.747 [INFO][5082] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" HandleID="k8s-pod-network.b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--zgh9j-eth0" Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.751 [INFO][5082] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:06.755658 containerd[1513]: 2025-01-15 13:46:06.753 [INFO][5076] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853" Jan 15 13:46:06.756947 containerd[1513]: time="2025-01-15T13:46:06.755720632Z" level=info msg="TearDown network for sandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" successfully" Jan 15 13:46:06.763146 containerd[1513]: time="2025-01-15T13:46:06.762468636Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:46:06.763146 containerd[1513]: time="2025-01-15T13:46:06.762563614Z" level=info msg="RemovePodSandbox \"b397dd8be86e369552ecfd60d08a3db733be4f3aae527f235268d8d40df1d853\" returns successfully" Jan 15 13:46:06.764023 containerd[1513]: time="2025-01-15T13:46:06.763667260Z" level=info msg="StopPodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\"" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.884 [WARNING][5101] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"299bee91-9829-4be7-9e66-0041031d6397", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070", Pod:"coredns-7db6d8ff4d-2w45s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib97b0367ad0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.885 [INFO][5101] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.885 [INFO][5101] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" iface="eth0" netns="" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.885 [INFO][5101] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.885 [INFO][5101] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.937 [INFO][5108] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.937 [INFO][5108] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.937 [INFO][5108] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.958 [WARNING][5108] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.959 [INFO][5108] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.965 [INFO][5108] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:06.975141 containerd[1513]: 2025-01-15 13:46:06.971 [INFO][5101] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:06.979342 containerd[1513]: time="2025-01-15T13:46:06.975209249Z" level=info msg="TearDown network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" successfully" Jan 15 13:46:06.979342 containerd[1513]: time="2025-01-15T13:46:06.975246424Z" level=info msg="StopPodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" returns successfully" Jan 15 13:46:06.979342 containerd[1513]: time="2025-01-15T13:46:06.976237624Z" level=info msg="RemovePodSandbox for \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\"" Jan 15 13:46:06.979342 containerd[1513]: time="2025-01-15T13:46:06.976282496Z" level=info msg="Forcibly stopping sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\"" Jan 15 13:46:07.072567 containerd[1513]: time="2025-01-15T13:46:07.071991171Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:07.073327 containerd[1513]: time="2025-01-15T13:46:07.073268101Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 15 13:46:07.075936 containerd[1513]: time="2025-01-15T13:46:07.075802826Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:07.083745 containerd[1513]: time="2025-01-15T13:46:07.083595593Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:07.085087 containerd[1513]: time="2025-01-15T13:46:07.084934830Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 4.333429593s" Jan 15 13:46:07.085087 containerd[1513]: time="2025-01-15T13:46:07.085023860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference 
\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 15 13:46:07.090355 containerd[1513]: time="2025-01-15T13:46:07.090192146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 15 13:46:07.092896 containerd[1513]: time="2025-01-15T13:46:07.092735570Z" level=info msg="CreateContainer within sandbox \"61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 15 13:46:07.122431 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount946955687.mount: Deactivated successfully. Jan 15 13:46:07.127568 containerd[1513]: time="2025-01-15T13:46:07.127475399Z" level=info msg="CreateContainer within sandbox \"61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5fcb2b8e9ba4feca175b6b9e49bd8b14c5f49795e1b7ae53180f3e35151f2e3b\"" Jan 15 13:46:07.130400 containerd[1513]: time="2025-01-15T13:46:07.129657125Z" level=info msg="StartContainer for \"5fcb2b8e9ba4feca175b6b9e49bd8b14c5f49795e1b7ae53180f3e35151f2e3b\"" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.074 [WARNING][5130] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"299bee91-9829-4be7-9e66-0041031d6397", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"74bba797b4b4501f66202d077f9e232f5e6e6d7d175258f23a30c22d09b54070", Pod:"coredns-7db6d8ff4d-2w45s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib97b0367ad0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.074 [INFO][5130] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.074 [INFO][5130] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" iface="eth0" netns="" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.074 [INFO][5130] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.075 [INFO][5130] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.156 [INFO][5142] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.156 [INFO][5142] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.156 [INFO][5142] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.173 [WARNING][5142] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.173 [INFO][5142] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" HandleID="k8s-pod-network.3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--2w45s-eth0" Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.176 [INFO][5142] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:07.193581 containerd[1513]: 2025-01-15 13:46:07.183 [INFO][5130] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea" Jan 15 13:46:07.196084 containerd[1513]: time="2025-01-15T13:46:07.195332107Z" level=info msg="TearDown network for sandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" successfully" Jan 15 13:46:07.212981 containerd[1513]: time="2025-01-15T13:46:07.212907793Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:46:07.213334 containerd[1513]: time="2025-01-15T13:46:07.213227408Z" level=info msg="RemovePodSandbox \"3cafa2834e043f038ca651b699cee81752b43ea69d0fe0b564a1dde6f5980cea\" returns successfully" Jan 15 13:46:07.214038 containerd[1513]: time="2025-01-15T13:46:07.213985937Z" level=info msg="StopPodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\"" Jan 15 13:46:07.246358 systemd[1]: run-containerd-runc-k8s.io-5fcb2b8e9ba4feca175b6b9e49bd8b14c5f49795e1b7ae53180f3e35151f2e3b-runc.0e8nLN.mount: Deactivated successfully. 
Jan 15 13:46:07.259699 systemd[1]: Started cri-containerd-5fcb2b8e9ba4feca175b6b9e49bd8b14c5f49795e1b7ae53180f3e35151f2e3b.scope - libcontainer container 5fcb2b8e9ba4feca175b6b9e49bd8b14c5f49795e1b7ae53180f3e35151f2e3b. Jan 15 13:46:07.365771 containerd[1513]: time="2025-01-15T13:46:07.365716222Z" level=info msg="StartContainer for \"5fcb2b8e9ba4feca175b6b9e49bd8b14c5f49795e1b7ae53180f3e35151f2e3b\" returns successfully" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.307 [WARNING][5182] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0", GenerateName:"calico-kube-controllers-8496b9bc76-", Namespace:"calico-system", SelfLink:"", UID:"9c55259c-6a2f-4fd1-8729-52141d279855", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8496b9bc76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6", Pod:"calico-kube-controllers-8496b9bc76-lh2fq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali874334f92a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.308 [INFO][5182] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.308 [INFO][5182] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" iface="eth0" netns="" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.308 [INFO][5182] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.308 [INFO][5182] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.379 [INFO][5195] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.379 [INFO][5195] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.379 [INFO][5195] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.390 [WARNING][5195] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.390 [INFO][5195] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.394 [INFO][5195] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:07.399559 containerd[1513]: 2025-01-15 13:46:07.397 [INFO][5182] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.402134 containerd[1513]: time="2025-01-15T13:46:07.399639977Z" level=info msg="TearDown network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" successfully" Jan 15 13:46:07.402134 containerd[1513]: time="2025-01-15T13:46:07.399677164Z" level=info msg="StopPodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" returns successfully" Jan 15 13:46:07.402134 containerd[1513]: time="2025-01-15T13:46:07.400883304Z" level=info msg="RemovePodSandbox for \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\"" Jan 15 13:46:07.402134 containerd[1513]: time="2025-01-15T13:46:07.400927068Z" level=info msg="Forcibly stopping sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\"" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.474 [WARNING][5223] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0", GenerateName:"calico-kube-controllers-8496b9bc76-", Namespace:"calico-system", SelfLink:"", UID:"9c55259c-6a2f-4fd1-8729-52141d279855", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8496b9bc76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6", Pod:"calico-kube-controllers-8496b9bc76-lh2fq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali874334f92a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.474 [INFO][5223] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.474 [INFO][5223] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" iface="eth0" netns="" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.474 [INFO][5223] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.474 [INFO][5223] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.509 [INFO][5232] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.509 [INFO][5232] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.510 [INFO][5232] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.520 [WARNING][5232] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.520 [INFO][5232] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" HandleID="k8s-pod-network.0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.522 [INFO][5232] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:07.526960 containerd[1513]: 2025-01-15 13:46:07.524 [INFO][5223] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d" Jan 15 13:46:07.530368 containerd[1513]: time="2025-01-15T13:46:07.528578947Z" level=info msg="TearDown network for sandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" successfully" Jan 15 13:46:07.533914 containerd[1513]: time="2025-01-15T13:46:07.533878868Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:46:07.533988 containerd[1513]: time="2025-01-15T13:46:07.533967572Z" level=info msg="RemovePodSandbox \"0f45034cbc873b6269341cf19a92017ba778bfe06506d7eaecb52359128cc67d\" returns successfully" Jan 15 13:46:07.535039 containerd[1513]: time="2025-01-15T13:46:07.534990595Z" level=info msg="StopPodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\"" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.590 [WARNING][5250] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27", Pod:"coredns-7db6d8ff4d-qstwj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93d08021791", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.592 [INFO][5250] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.592 [INFO][5250] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" iface="eth0" netns="" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.592 [INFO][5250] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.592 [INFO][5250] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.636 [INFO][5256] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.636 [INFO][5256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.636 [INFO][5256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.646 [WARNING][5256] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.646 [INFO][5256] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.648 [INFO][5256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:07.653395 containerd[1513]: 2025-01-15 13:46:07.650 [INFO][5250] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.655089 containerd[1513]: time="2025-01-15T13:46:07.654223766Z" level=info msg="TearDown network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" successfully" Jan 15 13:46:07.655089 containerd[1513]: time="2025-01-15T13:46:07.654261700Z" level=info msg="StopPodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" returns successfully" Jan 15 13:46:07.655856 containerd[1513]: time="2025-01-15T13:46:07.655794646Z" level=info msg="RemovePodSandbox for \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\"" Jan 15 13:46:07.655967 containerd[1513]: time="2025-01-15T13:46:07.655871466Z" level=info msg="Forcibly stopping sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\"" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.716 [WARNING][5276] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"2cd6f230-e12d-4e00-8ac3-2b9a6443dd5a", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"b38faef506bb51ffb71932947a71d54206003b217b2267f992349048f9837b27", Pod:"coredns-7db6d8ff4d-qstwj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93d08021791", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.717 [INFO][5276] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.717 [INFO][5276] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" iface="eth0" netns="" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.717 [INFO][5276] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.717 [INFO][5276] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.759 [INFO][5282] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.760 [INFO][5282] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.760 [INFO][5282] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.768 [WARNING][5282] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.768 [INFO][5282] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" HandleID="k8s-pod-network.e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Workload="srv--6yg2e.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--qstwj-eth0" Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.770 [INFO][5282] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:07.775585 containerd[1513]: 2025-01-15 13:46:07.772 [INFO][5276] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7" Jan 15 13:46:07.776790 containerd[1513]: time="2025-01-15T13:46:07.776229576Z" level=info msg="TearDown network for sandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" successfully" Jan 15 13:46:07.818064 containerd[1513]: time="2025-01-15T13:46:07.817838950Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:46:07.818318 containerd[1513]: time="2025-01-15T13:46:07.818061414Z" level=info msg="RemovePodSandbox \"e40a504e53fcc918aff5390763b8c4a1ba1c884a11d2e2c3d1e6bfec663c66e7\" returns successfully" Jan 15 13:46:07.820646 containerd[1513]: time="2025-01-15T13:46:07.820084461Z" level=info msg="StopPodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\"" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.882 [WARNING][5300] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"403d6493-86a8-45cb-bcf7-b66df6eeb925", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1", Pod:"calico-apiserver-54c9c669d7-ls2h6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb919b18e49", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.882 [INFO][5300] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.882 [INFO][5300] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" iface="eth0" netns="" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.882 [INFO][5300] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.882 [INFO][5300] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.927 [INFO][5307] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.928 [INFO][5307] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.928 [INFO][5307] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.938 [WARNING][5307] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.938 [INFO][5307] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.940 [INFO][5307] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:07.945028 containerd[1513]: 2025-01-15 13:46:07.942 [INFO][5300] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:07.946724 containerd[1513]: time="2025-01-15T13:46:07.945580357Z" level=info msg="TearDown network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" successfully" Jan 15 13:46:07.946724 containerd[1513]: time="2025-01-15T13:46:07.945617616Z" level=info msg="StopPodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" returns successfully" Jan 15 13:46:07.946724 containerd[1513]: time="2025-01-15T13:46:07.946325486Z" level=info msg="RemovePodSandbox for \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\"" Jan 15 13:46:07.946724 containerd[1513]: time="2025-01-15T13:46:07.946365509Z" level=info msg="Forcibly stopping sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\"" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.018 [WARNING][5326] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0", GenerateName:"calico-apiserver-54c9c669d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"403d6493-86a8-45cb-bcf7-b66df6eeb925", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 45, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54c9c669d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"61bf797761497a080f43d2ac726aedcedaf07ec5063e8008ce981976f3089aa1", Pod:"calico-apiserver-54c9c669d7-ls2h6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb919b18e49", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.019 [INFO][5326] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.019 [INFO][5326] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" iface="eth0" netns="" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.019 [INFO][5326] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.019 [INFO][5326] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.065 [INFO][5333] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.065 [INFO][5333] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.065 [INFO][5333] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.078 [WARNING][5333] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.078 [INFO][5333] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" HandleID="k8s-pod-network.05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--apiserver--54c9c669d7--ls2h6-eth0" Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.080 [INFO][5333] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:08.086777 containerd[1513]: 2025-01-15 13:46:08.082 [INFO][5326] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26" Jan 15 13:46:08.086777 containerd[1513]: time="2025-01-15T13:46:08.085262824Z" level=info msg="TearDown network for sandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" successfully" Jan 15 13:46:08.093109 containerd[1513]: time="2025-01-15T13:46:08.093067054Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:46:08.093210 containerd[1513]: time="2025-01-15T13:46:08.093152216Z" level=info msg="RemovePodSandbox \"05e78877e9e4191b5dacf259f42c904959e4d4cbe8d3f3529091131b2d9a8a26\" returns successfully" Jan 15 13:46:09.360522 kubelet[2711]: I0115 13:46:09.360038 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 13:46:09.416718 containerd[1513]: time="2025-01-15T13:46:09.416019410Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:09.419467 containerd[1513]: time="2025-01-15T13:46:09.418736744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 15 13:46:09.424468 containerd[1513]: time="2025-01-15T13:46:09.423098446Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:09.427129 containerd[1513]: time="2025-01-15T13:46:09.427088125Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:09.430810 containerd[1513]: time="2025-01-15T13:46:09.430744440Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.340507199s" Jan 15 13:46:09.430919 containerd[1513]: time="2025-01-15T13:46:09.430814927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image 
reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 15 13:46:09.434602 containerd[1513]: time="2025-01-15T13:46:09.434564915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 15 13:46:09.436007 containerd[1513]: time="2025-01-15T13:46:09.435974346Z" level=info msg="CreateContainer within sandbox \"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 15 13:46:09.459755 containerd[1513]: time="2025-01-15T13:46:09.459695899Z" level=info msg="CreateContainer within sandbox \"dcb2a1430400cef375549be029128363fc9b282dd98e9a0e14093601d55d401c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"61c8df477a76000daf3e4e14c7c69c34ffa1d37aa53ceb2a3c6910ecac87a6bb\"" Jan 15 13:46:09.461839 containerd[1513]: time="2025-01-15T13:46:09.461562837Z" level=info msg="StartContainer for \"61c8df477a76000daf3e4e14c7c69c34ffa1d37aa53ceb2a3c6910ecac87a6bb\"" Jan 15 13:46:09.536663 systemd[1]: Started cri-containerd-61c8df477a76000daf3e4e14c7c69c34ffa1d37aa53ceb2a3c6910ecac87a6bb.scope - libcontainer container 61c8df477a76000daf3e4e14c7c69c34ffa1d37aa53ceb2a3c6910ecac87a6bb. Jan 15 13:46:09.594833 containerd[1513]: time="2025-01-15T13:46:09.594752113Z" level=info msg="StartContainer for \"61c8df477a76000daf3e4e14c7c69c34ffa1d37aa53ceb2a3c6910ecac87a6bb\" returns successfully" Jan 15 13:46:09.815633 containerd[1513]: time="2025-01-15T13:46:09.815550907Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 13:46:09.816571 containerd[1513]: time="2025-01-15T13:46:09.816479688Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 15 13:46:09.820541 containerd[1513]: time="2025-01-15T13:46:09.820357313Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 385.745043ms" Jan 15 13:46:09.820541 containerd[1513]: time="2025-01-15T13:46:09.820463545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 15 13:46:09.826855 containerd[1513]: time="2025-01-15T13:46:09.826789478Z" level=info msg="CreateContainer within sandbox \"e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 15 13:46:09.851252 containerd[1513]: time="2025-01-15T13:46:09.851116834Z" level=info msg="CreateContainer within sandbox \"e5daeae8a93a0485773b9a8ad42861083dd3aadd0e05ca25bbcccc6b898bda9a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"670caeb8f56c6ca061fe00910093b549a62ff463dee9572e82a83b71abf033f9\"" Jan 15 13:46:09.852771 containerd[1513]: time="2025-01-15T13:46:09.852734966Z" level=info msg="StartContainer for \"670caeb8f56c6ca061fe00910093b549a62ff463dee9572e82a83b71abf033f9\"" Jan 15 13:46:09.891754 systemd[1]: Started cri-containerd-670caeb8f56c6ca061fe00910093b549a62ff463dee9572e82a83b71abf033f9.scope - libcontainer container 
670caeb8f56c6ca061fe00910093b549a62ff463dee9572e82a83b71abf033f9. Jan 15 13:46:09.955977 containerd[1513]: time="2025-01-15T13:46:09.955786806Z" level=info msg="StartContainer for \"670caeb8f56c6ca061fe00910093b549a62ff463dee9572e82a83b71abf033f9\" returns successfully" Jan 15 13:46:10.284744 kubelet[2711]: I0115 13:46:10.284672 2711 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 15 13:46:10.284945 kubelet[2711]: I0115 13:46:10.284765 2711 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 15 13:46:10.411482 kubelet[2711]: I0115 13:46:10.411378 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zz4wr" podStartSLOduration=29.090610623 podStartE2EDuration="42.411347313s" podCreationTimestamp="2025-01-15 13:45:28 +0000 UTC" firstStartedPulling="2025-01-15 13:45:56.112929859 +0000 UTC m=+50.365607932" lastFinishedPulling="2025-01-15 13:46:09.43366655 +0000 UTC m=+63.686344622" observedRunningTime="2025-01-15 13:46:10.408413486 +0000 UTC m=+64.661091577" watchObservedRunningTime="2025-01-15 13:46:10.411347313 +0000 UTC m=+64.664025397" Jan 15 13:46:10.412315 kubelet[2711]: I0115 13:46:10.411717 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-54c9c669d7-ls2h6" podStartSLOduration=33.469209915 podStartE2EDuration="41.411708775s" podCreationTimestamp="2025-01-15 13:45:29 +0000 UTC" firstStartedPulling="2025-01-15 13:45:59.146616435 +0000 UTC m=+53.399294528" lastFinishedPulling="2025-01-15 13:46:07.089115308 +0000 UTC m=+61.341793388" observedRunningTime="2025-01-15 13:46:08.392246973 +0000 UTC m=+62.644925062" watchObservedRunningTime="2025-01-15 13:46:10.411708775 +0000 UTC m=+64.664386861" Jan 15 13:46:11.399396 kubelet[2711]: I0115 13:46:11.399341 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 13:46:14.492285 kubelet[2711]: I0115 13:46:14.492223 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 13:46:14.552724 kubelet[2711]: I0115 13:46:14.551725 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-54c9c669d7-zgh9j" podStartSLOduration=37.847867533 podStartE2EDuration="45.551702267s" podCreationTimestamp="2025-01-15 13:45:29 +0000 UTC" firstStartedPulling="2025-01-15 13:46:02.117597703 +0000 UTC m=+56.370275776" lastFinishedPulling="2025-01-15 13:46:09.821432425 +0000 UTC m=+64.074110510" observedRunningTime="2025-01-15 13:46:10.440619478 +0000 UTC m=+64.693297571" watchObservedRunningTime="2025-01-15 13:46:14.551702267 +0000 UTC m=+68.804380352" Jan 15 13:46:21.578975 kubelet[2711]: I0115 13:46:21.578402 2711 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 13:46:29.798583 containerd[1513]: time="2025-01-15T13:46:29.798488655Z" level=info msg="StopContainer for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" with timeout 300 (s)" Jan 15 13:46:29.804468 containerd[1513]: time="2025-01-15T13:46:29.802990566Z" level=info msg="Stop container \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" with signal terminated" Jan 15 13:46:30.155194 containerd[1513]: time="2025-01-15T13:46:30.154757268Z" level=info msg="StopContainer for 
\"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\" with timeout 30 (s)" Jan 15 13:46:30.155618 containerd[1513]: time="2025-01-15T13:46:30.155515363Z" level=info msg="Stop container \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\" with signal terminated" Jan 15 13:46:30.200850 systemd[1]: cri-containerd-b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3.scope: Deactivated successfully. Jan 15 13:46:30.259336 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3-rootfs.mount: Deactivated successfully. Jan 15 13:46:30.303157 containerd[1513]: time="2025-01-15T13:46:30.268578809Z" level=info msg="shim disconnected" id=b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3 namespace=k8s.io Jan 15 13:46:30.318119 containerd[1513]: time="2025-01-15T13:46:30.317738090Z" level=warning msg="cleaning up after shim disconnected" id=b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3 namespace=k8s.io Jan 15 13:46:30.318119 containerd[1513]: time="2025-01-15T13:46:30.317799522Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:30.391930 containerd[1513]: time="2025-01-15T13:46:30.391806858Z" level=warning msg="cleanup warnings time=\"2025-01-15T13:46:30Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 15 13:46:30.398374 containerd[1513]: time="2025-01-15T13:46:30.398323815Z" level=info msg="StopContainer for \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\" returns successfully" Jan 15 13:46:30.399488 containerd[1513]: time="2025-01-15T13:46:30.399205033Z" level=info msg="StopPodSandbox for \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\"" Jan 15 13:46:30.412515 containerd[1513]: time="2025-01-15T13:46:30.407278400Z" level=info msg="Container to stop \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 15 13:46:30.416072 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6-shm.mount: Deactivated successfully. Jan 15 13:46:30.424477 containerd[1513]: time="2025-01-15T13:46:30.424315805Z" level=info msg="StopContainer for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" with timeout 5 (s)" Jan 15 13:46:30.426226 containerd[1513]: time="2025-01-15T13:46:30.426104242Z" level=info msg="Stop container \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" with signal terminated" Jan 15 13:46:30.443751 systemd[1]: cri-containerd-cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6.scope: Deactivated successfully. Jan 15 13:46:30.494924 systemd[1]: cri-containerd-0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e.scope: Deactivated successfully. Jan 15 13:46:30.495256 systemd[1]: cri-containerd-0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e.scope: Consumed 3.582s CPU time. Jan 15 13:46:30.520052 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6-rootfs.mount: Deactivated successfully. 
Jan 15 13:46:30.525227 containerd[1513]: time="2025-01-15T13:46:30.524844743Z" level=info msg="shim disconnected" id=cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6 namespace=k8s.io Jan 15 13:46:30.525227 containerd[1513]: time="2025-01-15T13:46:30.524912544Z" level=warning msg="cleaning up after shim disconnected" id=cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6 namespace=k8s.io Jan 15 13:46:30.525227 containerd[1513]: time="2025-01-15T13:46:30.524929157Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:30.561100 containerd[1513]: time="2025-01-15T13:46:30.560504579Z" level=warning msg="cleanup warnings time=\"2025-01-15T13:46:30Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 15 13:46:30.587065 containerd[1513]: time="2025-01-15T13:46:30.586780171Z" level=info msg="shim disconnected" id=0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e namespace=k8s.io Jan 15 13:46:30.587065 containerd[1513]: time="2025-01-15T13:46:30.586914312Z" level=warning msg="cleaning up after shim disconnected" id=0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e namespace=k8s.io Jan 15 13:46:30.587065 containerd[1513]: time="2025-01-15T13:46:30.586938405Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:30.591397 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e-rootfs.mount: Deactivated successfully. Jan 15 13:46:30.647117 containerd[1513]: time="2025-01-15T13:46:30.646580336Z" level=info msg="StopContainer for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" returns successfully" Jan 15 13:46:30.647751 containerd[1513]: time="2025-01-15T13:46:30.647712281Z" level=info msg="StopPodSandbox for \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\"" Jan 15 13:46:30.647827 containerd[1513]: time="2025-01-15T13:46:30.647770900Z" level=info msg="Container to stop \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 15 13:46:30.647827 containerd[1513]: time="2025-01-15T13:46:30.647793720Z" level=info msg="Container to stop \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 15 13:46:30.648733 containerd[1513]: time="2025-01-15T13:46:30.648676240Z" level=info msg="Container to stop \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 15 13:46:30.655092 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19-shm.mount: Deactivated successfully. Jan 15 13:46:30.675982 systemd[1]: cri-containerd-f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19.scope: Deactivated successfully. 
Jan 15 13:46:30.736418 containerd[1513]: time="2025-01-15T13:46:30.736137151Z" level=info msg="shim disconnected" id=f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19 namespace=k8s.io Jan 15 13:46:30.736418 containerd[1513]: time="2025-01-15T13:46:30.736208956Z" level=warning msg="cleaning up after shim disconnected" id=f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19 namespace=k8s.io Jan 15 13:46:30.736418 containerd[1513]: time="2025-01-15T13:46:30.736225887Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:30.789880 containerd[1513]: time="2025-01-15T13:46:30.789812584Z" level=info msg="TearDown network for sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" successfully" Jan 15 13:46:30.789880 containerd[1513]: time="2025-01-15T13:46:30.789860669Z" level=info msg="StopPodSandbox for \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" returns successfully" Jan 15 13:46:30.816917 systemd-networkd[1436]: cali874334f92a8: Link DOWN Jan 15 13:46:30.816931 systemd-networkd[1436]: cali874334f92a8: Lost carrier Jan 15 13:46:30.901584 kubelet[2711]: I0115 13:46:30.894619 2711 topology_manager.go:215] "Topology Admit Handler" podUID="9c53573f-53e2-42a1-a19e-e6752621ccbe" podNamespace="calico-system" podName="calico-node-np8zb" Jan 15 13:46:30.908805 kubelet[2711]: E0115 13:46:30.908548 2711 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="c87ee128-b80e-461f-b0cc-3aafd8d5be53" containerName="flexvol-driver" Jan 15 13:46:30.908805 kubelet[2711]: E0115 13:46:30.908664 2711 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="c87ee128-b80e-461f-b0cc-3aafd8d5be53" containerName="install-cni" Jan 15 13:46:30.908805 kubelet[2711]: E0115 13:46:30.908685 2711 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="c87ee128-b80e-461f-b0cc-3aafd8d5be53" containerName="calico-node" Jan 15 13:46:30.920040 kubelet[2711]: I0115 13:46:30.919810 2711 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87ee128-b80e-461f-b0cc-3aafd8d5be53" containerName="calico-node" Jan 15 13:46:30.942217 systemd[1]: Created slice kubepods-besteffort-pod9c53573f_53e2_42a1_a19e_e6752621ccbe.slice - libcontainer container kubepods-besteffort-pod9c53573f_53e2_42a1_a19e_e6752621ccbe.slice. 
Jan 15 13:46:30.971468 kubelet[2711]: I0115 13:46:30.971383 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-lib-modules\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971468 kubelet[2711]: I0115 13:46:30.971469 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-flexvol-driver-host\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971721 kubelet[2711]: I0115 13:46:30.971506 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-run-calico\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971721 kubelet[2711]: I0115 13:46:30.971539 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg5zz\" (UniqueName: \"kubernetes.io/projected/c87ee128-b80e-461f-b0cc-3aafd8d5be53-kube-api-access-tg5zz\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971721 kubelet[2711]: I0115 13:46:30.971570 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-xtables-lock\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971721 kubelet[2711]: I0115 13:46:30.971598 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-log-dir\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971721 kubelet[2711]: I0115 13:46:30.971625 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c87ee128-b80e-461f-b0cc-3aafd8d5be53-tigera-ca-bundle\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.971721 kubelet[2711]: I0115 13:46:30.971658 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-policysync\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.972032 kubelet[2711]: I0115 13:46:30.971685 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c87ee128-b80e-461f-b0cc-3aafd8d5be53-node-certs\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.972032 kubelet[2711]: I0115 13:46:30.971734 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-lib-calico\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.972032 kubelet[2711]: I0115 
13:46:30.971757 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-net-dir\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.972032 kubelet[2711]: I0115 13:46:30.971786 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-bin-dir\") pod \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\" (UID: \"c87ee128-b80e-461f-b0cc-3aafd8d5be53\") " Jan 15 13:46:30.972032 kubelet[2711]: I0115 13:46:30.971906 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.972032 kubelet[2711]: I0115 13:46:30.971985 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.974380 kubelet[2711]: I0115 13:46:30.972050 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.974380 kubelet[2711]: I0115 13:46:30.972095 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.974380 kubelet[2711]: I0115 13:46:30.972213 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.990721 kubelet[2711]: I0115 13:46:30.990200 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-policysync" (OuterVolumeSpecName: "policysync") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "policysync". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.992317 kubelet[2711]: I0115 13:46:30.992260 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.995066 kubelet[2711]: I0115 13:46:30.994708 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:30.995066 kubelet[2711]: I0115 13:46:30.994771 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 15 13:46:31.017592 kubelet[2711]: I0115 13:46:31.014995 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c87ee128-b80e-461f-b0cc-3aafd8d5be53-node-certs" (OuterVolumeSpecName: "node-certs") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.812 [INFO][5647] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.813 [INFO][5647] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" iface="eth0" netns="/var/run/netns/cni-34e3dd84-9997-afc5-023a-88a2a05eb9f3" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.814 [INFO][5647] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" iface="eth0" netns="/var/run/netns/cni-34e3dd84-9997-afc5-023a-88a2a05eb9f3" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.825 [INFO][5647] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" after=11.585303ms iface="eth0" netns="/var/run/netns/cni-34e3dd84-9997-afc5-023a-88a2a05eb9f3" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.825 [INFO][5647] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.825 [INFO][5647] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.907 [INFO][5684] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.911 [INFO][5684] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.911 [INFO][5684] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.986 [INFO][5684] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.986 [INFO][5684] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:30.994 [INFO][5684] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:46:31.019878 containerd[1513]: 2025-01-15 13:46:31.005 [INFO][5647] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:46:31.019878 containerd[1513]: time="2025-01-15T13:46:31.019272400Z" level=info msg="TearDown network for sandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" successfully" Jan 15 13:46:31.019878 containerd[1513]: time="2025-01-15T13:46:31.019310163Z" level=info msg="StopPodSandbox for \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" returns successfully" Jan 15 13:46:31.028709 kubelet[2711]: I0115 13:46:31.028622 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87ee128-b80e-461f-b0cc-3aafd8d5be53-kube-api-access-tg5zz" (OuterVolumeSpecName: "kube-api-access-tg5zz") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "kube-api-access-tg5zz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 15 13:46:31.031781 kubelet[2711]: I0115 13:46:31.031334 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87ee128-b80e-461f-b0cc-3aafd8d5be53-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "c87ee128-b80e-461f-b0cc-3aafd8d5be53" (UID: "c87ee128-b80e-461f-b0cc-3aafd8d5be53"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 15 13:46:31.072926 kubelet[2711]: I0115 13:46:31.072871 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-lib-modules\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.072926 kubelet[2711]: I0115 13:46:31.072934 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c53573f-53e2-42a1-a19e-e6752621ccbe-tigera-ca-bundle\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.073204 kubelet[2711]: I0115 13:46:31.072987 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-var-lib-calico\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.073204 kubelet[2711]: I0115 13:46:31.073019 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-policysync\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.073204 kubelet[2711]: I0115 13:46:31.073057 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-var-run-calico\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.073204 kubelet[2711]: I0115 13:46:31.073091 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbfts\" (UniqueName: \"kubernetes.io/projected/9c53573f-53e2-42a1-a19e-e6752621ccbe-kube-api-access-wbfts\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.073204 kubelet[2711]: I0115 13:46:31.073133 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-xtables-lock\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.074044 kubelet[2711]: I0115 13:46:31.073159 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-cni-bin-dir\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " 
pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.074044 kubelet[2711]: I0115 13:46:31.073191 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-cni-net-dir\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.074044 kubelet[2711]: I0115 13:46:31.073219 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9c53573f-53e2-42a1-a19e-e6752621ccbe-node-certs\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.074044 kubelet[2711]: I0115 13:46:31.073246 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-cni-log-dir\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.074044 kubelet[2711]: I0115 13:46:31.073276 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9c53573f-53e2-42a1-a19e-e6752621ccbe-flexvol-driver-host\") pod \"calico-node-np8zb\" (UID: \"9c53573f-53e2-42a1-a19e-e6752621ccbe\") " pod="calico-system/calico-node-np8zb" Jan 15 13:46:31.075952 kubelet[2711]: I0115 13:46:31.075917 2711 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-net-dir\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076043 kubelet[2711]: I0115 13:46:31.075968 2711 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-bin-dir\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076043 kubelet[2711]: I0115 13:46:31.075988 2711 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-run-calico\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076043 kubelet[2711]: I0115 13:46:31.076006 2711 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-tg5zz\" (UniqueName: \"kubernetes.io/projected/c87ee128-b80e-461f-b0cc-3aafd8d5be53-kube-api-access-tg5zz\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076043 kubelet[2711]: I0115 13:46:31.076022 2711 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-lib-modules\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076043 kubelet[2711]: I0115 13:46:31.076037 2711 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-flexvol-driver-host\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076258 kubelet[2711]: I0115 13:46:31.076050 2711 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-xtables-lock\") on node 
\"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076258 kubelet[2711]: I0115 13:46:31.076082 2711 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-cni-log-dir\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076258 kubelet[2711]: I0115 13:46:31.076095 2711 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c87ee128-b80e-461f-b0cc-3aafd8d5be53-node-certs\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076258 kubelet[2711]: I0115 13:46:31.076109 2711 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-var-lib-calico\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076258 kubelet[2711]: I0115 13:46:31.076122 2711 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c87ee128-b80e-461f-b0cc-3aafd8d5be53-tigera-ca-bundle\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.076258 kubelet[2711]: I0115 13:46:31.076139 2711 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c87ee128-b80e-461f-b0cc-3aafd8d5be53-policysync\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.177705 kubelet[2711]: I0115 13:46:31.176725 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c55259c-6a2f-4fd1-8729-52141d279855-tigera-ca-bundle\") pod \"9c55259c-6a2f-4fd1-8729-52141d279855\" (UID: \"9c55259c-6a2f-4fd1-8729-52141d279855\") " Jan 15 13:46:31.177705 kubelet[2711]: I0115 13:46:31.176808 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjk5\" (UniqueName: \"kubernetes.io/projected/9c55259c-6a2f-4fd1-8729-52141d279855-kube-api-access-dqjk5\") pod \"9c55259c-6a2f-4fd1-8729-52141d279855\" (UID: \"9c55259c-6a2f-4fd1-8729-52141d279855\") " Jan 15 13:46:31.186154 kubelet[2711]: I0115 13:46:31.186015 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c55259c-6a2f-4fd1-8729-52141d279855-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "9c55259c-6a2f-4fd1-8729-52141d279855" (UID: "9c55259c-6a2f-4fd1-8729-52141d279855"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 15 13:46:31.187861 kubelet[2711]: I0115 13:46:31.187809 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c55259c-6a2f-4fd1-8729-52141d279855-kube-api-access-dqjk5" (OuterVolumeSpecName: "kube-api-access-dqjk5") pod "9c55259c-6a2f-4fd1-8729-52141d279855" (UID: "9c55259c-6a2f-4fd1-8729-52141d279855"). InnerVolumeSpecName "kube-api-access-dqjk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 15 13:46:31.251248 containerd[1513]: time="2025-01-15T13:46:31.251147871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-np8zb,Uid:9c53573f-53e2-42a1-a19e-e6752621ccbe,Namespace:calico-system,Attempt:0,}" Jan 15 13:46:31.262169 systemd[1]: var-lib-kubelet-pods-9c55259c\x2d6a2f\x2d4fd1\x2d8729\x2d52141d279855-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. 
Jan 15 13:46:31.262343 systemd[1]: run-netns-cni\x2d34e3dd84\x2d9997\x2dafc5\x2d023a\x2d88a2a05eb9f3.mount: Deactivated successfully. Jan 15 13:46:31.262489 systemd[1]: var-lib-kubelet-pods-c87ee128\x2db80e\x2d461f\x2db0cc\x2d3aafd8d5be53-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. Jan 15 13:46:31.262730 systemd[1]: var-lib-kubelet-pods-9c55259c\x2d6a2f\x2d4fd1\x2d8729\x2d52141d279855-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddqjk5.mount: Deactivated successfully. Jan 15 13:46:31.262883 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19-rootfs.mount: Deactivated successfully. Jan 15 13:46:31.263027 systemd[1]: var-lib-kubelet-pods-c87ee128\x2db80e\x2d461f\x2db0cc\x2d3aafd8d5be53-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtg5zz.mount: Deactivated successfully. Jan 15 13:46:31.263371 systemd[1]: var-lib-kubelet-pods-c87ee128\x2db80e\x2d461f\x2db0cc\x2d3aafd8d5be53-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Jan 15 13:46:31.277841 kubelet[2711]: I0115 13:46:31.277789 2711 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c55259c-6a2f-4fd1-8729-52141d279855-tigera-ca-bundle\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.278992 kubelet[2711]: I0115 13:46:31.277840 2711 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-dqjk5\" (UniqueName: \"kubernetes.io/projected/9c55259c-6a2f-4fd1-8729-52141d279855-kube-api-access-dqjk5\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:31.322102 containerd[1513]: time="2025-01-15T13:46:31.320642926Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:46:31.322102 containerd[1513]: time="2025-01-15T13:46:31.320778141Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:46:31.322102 containerd[1513]: time="2025-01-15T13:46:31.320809079Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:31.322102 containerd[1513]: time="2025-01-15T13:46:31.320945912Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:31.373733 systemd[1]: Started cri-containerd-6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e.scope - libcontainer container 6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e. Jan 15 13:46:31.425924 systemd[1]: cri-containerd-c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935.scope: Deactivated successfully. 
Jan 15 13:46:31.442352 containerd[1513]: time="2025-01-15T13:46:31.442255936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-np8zb,Uid:9c53573f-53e2-42a1-a19e-e6752621ccbe,Namespace:calico-system,Attempt:0,} returns sandbox id \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\"" Jan 15 13:46:31.448698 containerd[1513]: time="2025-01-15T13:46:31.448496922Z" level=info msg="CreateContainer within sandbox \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 15 13:46:31.476960 containerd[1513]: time="2025-01-15T13:46:31.476720882Z" level=info msg="CreateContainer within sandbox \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f\"" Jan 15 13:46:31.481505 containerd[1513]: time="2025-01-15T13:46:31.479061309Z" level=info msg="StartContainer for \"338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f\"" Jan 15 13:46:31.484533 containerd[1513]: time="2025-01-15T13:46:31.484464909Z" level=info msg="shim disconnected" id=c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935 namespace=k8s.io Jan 15 13:46:31.484625 containerd[1513]: time="2025-01-15T13:46:31.484532504Z" level=warning msg="cleaning up after shim disconnected" id=c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935 namespace=k8s.io Jan 15 13:46:31.484625 containerd[1513]: time="2025-01-15T13:46:31.484548185Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:31.506151 kubelet[2711]: I0115 13:46:31.505867 2711 scope.go:117] "RemoveContainer" containerID="0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e" Jan 15 13:46:31.544651 containerd[1513]: time="2025-01-15T13:46:31.544553738Z" level=info msg="RemoveContainer for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\"" Jan 15 13:46:31.581458 systemd[1]: Started cri-containerd-338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f.scope - libcontainer container 338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f. Jan 15 13:46:31.582253 systemd[1]: Removed slice kubepods-besteffort-pod9c55259c_6a2f_4fd1_8729_52141d279855.slice - libcontainer container kubepods-besteffort-pod9c55259c_6a2f_4fd1_8729_52141d279855.slice. Jan 15 13:46:31.582883 systemd[1]: Removed slice kubepods-besteffort-podc87ee128_b80e_461f_b0cc_3aafd8d5be53.slice - libcontainer container kubepods-besteffort-podc87ee128_b80e_461f_b0cc_3aafd8d5be53.slice. Jan 15 13:46:31.583015 systemd[1]: kubepods-besteffort-podc87ee128_b80e_461f_b0cc_3aafd8d5be53.slice: Consumed 4.251s CPU time. 
Jan 15 13:46:31.619711 containerd[1513]: time="2025-01-15T13:46:31.619609698Z" level=info msg="StopContainer for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" returns successfully" Jan 15 13:46:31.622219 containerd[1513]: time="2025-01-15T13:46:31.620523607Z" level=info msg="StopPodSandbox for \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\"" Jan 15 13:46:31.622219 containerd[1513]: time="2025-01-15T13:46:31.620588659Z" level=info msg="Container to stop \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jan 15 13:46:31.644208 systemd[1]: cri-containerd-086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99.scope: Deactivated successfully. Jan 15 13:46:31.674589 containerd[1513]: time="2025-01-15T13:46:31.673997839Z" level=info msg="RemoveContainer for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" returns successfully" Jan 15 13:46:31.688387 kubelet[2711]: I0115 13:46:31.688260 2711 scope.go:117] "RemoveContainer" containerID="70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417" Jan 15 13:46:31.694785 containerd[1513]: time="2025-01-15T13:46:31.694730491Z" level=info msg="RemoveContainer for \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\"" Jan 15 13:46:31.706187 containerd[1513]: time="2025-01-15T13:46:31.706007138Z" level=info msg="RemoveContainer for \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\" returns successfully" Jan 15 13:46:31.708823 kubelet[2711]: I0115 13:46:31.708264 2711 scope.go:117] "RemoveContainer" containerID="f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d" Jan 15 13:46:31.716472 containerd[1513]: time="2025-01-15T13:46:31.715931281Z" level=info msg="RemoveContainer for \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\"" Jan 15 13:46:31.736994 containerd[1513]: time="2025-01-15T13:46:31.736848529Z" level=info msg="RemoveContainer for \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\" returns successfully" Jan 15 13:46:31.738507 kubelet[2711]: I0115 13:46:31.737975 2711 scope.go:117] "RemoveContainer" containerID="0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e" Jan 15 13:46:31.790262 containerd[1513]: time="2025-01-15T13:46:31.753496756Z" level=error msg="ContainerStatus for \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\": not found" Jan 15 13:46:31.815002 containerd[1513]: time="2025-01-15T13:46:31.772553862Z" level=info msg="shim disconnected" id=086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99 namespace=k8s.io Jan 15 13:46:31.815886 containerd[1513]: time="2025-01-15T13:46:31.815194109Z" level=warning msg="cleaning up after shim disconnected" id=086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99 namespace=k8s.io Jan 15 13:46:31.815886 containerd[1513]: time="2025-01-15T13:46:31.815222182Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:31.816069 kubelet[2711]: E0115 13:46:31.815605 2711 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\": not found" 
containerID="0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e" Jan 15 13:46:31.816069 kubelet[2711]: I0115 13:46:31.815681 2711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e"} err="failed to get container status \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\": rpc error: code = NotFound desc = an error occurred when try to find container \"0c28707ebcfd4e5c69faaa11fe47afae91b7ada291ee60b7877ac9a2f8ce619e\": not found" Jan 15 13:46:31.816069 kubelet[2711]: I0115 13:46:31.815735 2711 scope.go:117] "RemoveContainer" containerID="70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417" Jan 15 13:46:31.816274 containerd[1513]: time="2025-01-15T13:46:31.816165824Z" level=error msg="ContainerStatus for \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\": not found" Jan 15 13:46:31.818387 kubelet[2711]: E0115 13:46:31.818226 2711 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\": not found" containerID="70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417" Jan 15 13:46:31.818387 kubelet[2711]: I0115 13:46:31.818266 2711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417"} err="failed to get container status \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\": rpc error: code = NotFound desc = an error occurred when try to find container \"70358fd492eca6eef049fe31d569396883fe66bb0009869da81bed2c5ae48417\": not found" Jan 15 13:46:31.818387 kubelet[2711]: I0115 13:46:31.818292 2711 scope.go:117] "RemoveContainer" containerID="f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d" Jan 15 13:46:31.820265 containerd[1513]: time="2025-01-15T13:46:31.819255034Z" level=error msg="ContainerStatus for \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\": not found" Jan 15 13:46:31.820635 kubelet[2711]: E0115 13:46:31.820606 2711 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\": not found" containerID="f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d" Jan 15 13:46:31.820799 kubelet[2711]: I0115 13:46:31.820766 2711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d"} err="failed to get container status \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\": rpc error: code = NotFound desc = an error occurred when try to find container \"f716dd0a0e967c16fd054ad526a376e4c080ab24353467d64cfbae478406484d\": not found" Jan 15 13:46:31.820946 kubelet[2711]: I0115 13:46:31.820924 2711 scope.go:117] "RemoveContainer" 
containerID="b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3" Jan 15 13:46:31.821398 containerd[1513]: time="2025-01-15T13:46:31.821366132Z" level=info msg="StartContainer for \"338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f\" returns successfully" Jan 15 13:46:31.827342 containerd[1513]: time="2025-01-15T13:46:31.827288224Z" level=info msg="RemoveContainer for \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\"" Jan 15 13:46:31.833880 containerd[1513]: time="2025-01-15T13:46:31.833837201Z" level=info msg="RemoveContainer for \"b441bfec94971c385fa9857fefd69f144f81933e26becca49be4ada284810cb3\" returns successfully" Jan 15 13:46:31.879325 containerd[1513]: time="2025-01-15T13:46:31.879254981Z" level=info msg="TearDown network for sandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" successfully" Jan 15 13:46:31.879611 containerd[1513]: time="2025-01-15T13:46:31.879488688Z" level=info msg="StopPodSandbox for \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" returns successfully" Jan 15 13:46:31.897365 kubelet[2711]: I0115 13:46:31.897304 2711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c55259c-6a2f-4fd1-8729-52141d279855" path="/var/lib/kubelet/pods/9c55259c-6a2f-4fd1-8729-52141d279855/volumes" Jan 15 13:46:31.899020 systemd[1]: cri-containerd-338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f.scope: Deactivated successfully. Jan 15 13:46:31.903456 kubelet[2711]: I0115 13:46:31.902313 2711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87ee128-b80e-461f-b0cc-3aafd8d5be53" path="/var/lib/kubelet/pods/c87ee128-b80e-461f-b0cc-3aafd8d5be53/volumes" Jan 15 13:46:31.954584 containerd[1513]: time="2025-01-15T13:46:31.954429523Z" level=info msg="shim disconnected" id=338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f namespace=k8s.io Jan 15 13:46:31.955341 containerd[1513]: time="2025-01-15T13:46:31.955077620Z" level=warning msg="cleaning up after shim disconnected" id=338dfd2173f55bb455869c2d351d5502a4a0afab80722fb5057261c1a7246e2f namespace=k8s.io Jan 15 13:46:31.955341 containerd[1513]: time="2025-01-15T13:46:31.955100893Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:31.990210 kubelet[2711]: I0115 13:46:31.989949 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-typha-certs\") pod \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\" (UID: \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\") " Jan 15 13:46:31.990210 kubelet[2711]: I0115 13:46:31.990049 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-tigera-ca-bundle\") pod \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\" (UID: \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\") " Jan 15 13:46:31.990210 kubelet[2711]: I0115 13:46:31.990095 2711 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dftq\" (UniqueName: \"kubernetes.io/projected/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-kube-api-access-6dftq\") pod \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\" (UID: \"4c7ae6ba-25db-42e4-a2dd-054903d9d6d2\") " Jan 15 13:46:32.000416 kubelet[2711]: I0115 13:46:32.000242 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-typha-certs" 
(OuterVolumeSpecName: "typha-certs") pod "4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" (UID: "4c7ae6ba-25db-42e4-a2dd-054903d9d6d2"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 15 13:46:32.001398 kubelet[2711]: I0115 13:46:32.001200 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-kube-api-access-6dftq" (OuterVolumeSpecName: "kube-api-access-6dftq") pod "4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" (UID: "4c7ae6ba-25db-42e4-a2dd-054903d9d6d2"). InnerVolumeSpecName "kube-api-access-6dftq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 15 13:46:32.016032 kubelet[2711]: I0115 13:46:32.014844 2711 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" (UID: "4c7ae6ba-25db-42e4-a2dd-054903d9d6d2"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 15 13:46:32.090918 kubelet[2711]: I0115 13:46:32.090546 2711 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-6dftq\" (UniqueName: \"kubernetes.io/projected/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-kube-api-access-6dftq\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:32.090918 kubelet[2711]: I0115 13:46:32.090603 2711 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-typha-certs\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:32.090918 kubelet[2711]: I0115 13:46:32.090624 2711 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2-tigera-ca-bundle\") on node \"srv-6yg2e.gb1.brightbox.com\" DevicePath \"\"" Jan 15 13:46:32.255407 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935-rootfs.mount: Deactivated successfully. Jan 15 13:46:32.255587 systemd[1]: var-lib-kubelet-pods-4c7ae6ba\x2d25db\x2d42e4\x2da2dd\x2d054903d9d6d2-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Jan 15 13:46:32.255710 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99-rootfs.mount: Deactivated successfully. Jan 15 13:46:32.255818 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99-shm.mount: Deactivated successfully. Jan 15 13:46:32.255916 systemd[1]: var-lib-kubelet-pods-4c7ae6ba\x2d25db\x2d42e4\x2da2dd\x2d054903d9d6d2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6dftq.mount: Deactivated successfully. Jan 15 13:46:32.256024 systemd[1]: var-lib-kubelet-pods-4c7ae6ba\x2d25db\x2d42e4\x2da2dd\x2d054903d9d6d2-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. 
Jan 15 13:46:32.576922 kubelet[2711]: I0115 13:46:32.576864 2711 scope.go:117] "RemoveContainer" containerID="c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935" Jan 15 13:46:32.587526 containerd[1513]: time="2025-01-15T13:46:32.586681879Z" level=info msg="RemoveContainer for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\"" Jan 15 13:46:32.586887 systemd[1]: Removed slice kubepods-besteffort-pod4c7ae6ba_25db_42e4_a2dd_054903d9d6d2.slice - libcontainer container kubepods-besteffort-pod4c7ae6ba_25db_42e4_a2dd_054903d9d6d2.slice. Jan 15 13:46:32.594051 containerd[1513]: time="2025-01-15T13:46:32.594017189Z" level=info msg="RemoveContainer for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" returns successfully" Jan 15 13:46:32.595898 kubelet[2711]: I0115 13:46:32.595854 2711 scope.go:117] "RemoveContainer" containerID="c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935" Jan 15 13:46:32.597594 containerd[1513]: time="2025-01-15T13:46:32.597551155Z" level=error msg="ContainerStatus for \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\": not found" Jan 15 13:46:32.597784 kubelet[2711]: E0115 13:46:32.597754 2711 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\": not found" containerID="c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935" Jan 15 13:46:32.597854 kubelet[2711]: I0115 13:46:32.597795 2711 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935"} err="failed to get container status \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\": rpc error: code = NotFound desc = an error occurred when try to find container \"c77643eb06d2a62c249948e74f3301e42a1bc05d29f60ed19fadf499404cf935\": not found" Jan 15 13:46:32.604217 containerd[1513]: time="2025-01-15T13:46:32.603999600Z" level=info msg="CreateContainer within sandbox \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 15 13:46:32.676387 containerd[1513]: time="2025-01-15T13:46:32.676303635Z" level=info msg="CreateContainer within sandbox \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6\"" Jan 15 13:46:32.678965 containerd[1513]: time="2025-01-15T13:46:32.677360305Z" level=info msg="StartContainer for \"398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6\"" Jan 15 13:46:32.731723 systemd[1]: Started cri-containerd-398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6.scope - libcontainer container 398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6. 
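The paired "RemoveContainer ... returns successfully" and "ContainerStatus ... not found" entries above are the expected shape of idempotent cleanup: once the runtime has removed a container, the follow-up status query returns gRPC NotFound, which the caller records and moves past rather than treating as failure. A sketch of that pattern using the standard gRPC status package (assumed here for illustration; this is not kubelet's actual code):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent sketches the idempotent-cleanup pattern behind the log
// entries above: once a container is gone, a NotFound status from the
// runtime counts as success, not failure.
func removeIfPresent(remove func(id string) error, id string) error {
	if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
		return fmt.Errorf("removing %s: %w", id, err)
	}
	return nil
}

func main() {
	alreadyGone := func(id string) error {
		// Mirrors the runtime's verbatim error text seen in the log.
		return status.Error(codes.NotFound, "an error occurred when try to find container")
	}
	fmt.Println(removeIfPresent(alreadyGone, "c77643eb06d2...")) // <nil>
}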
Jan 15 13:46:32.776539 containerd[1513]: time="2025-01-15T13:46:32.776451687Z" level=info msg="StartContainer for \"398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6\" returns successfully" Jan 15 13:46:33.387641 kubelet[2711]: I0115 13:46:33.387540 2711 topology_manager.go:215] "Topology Admit Handler" podUID="4d2bce4c-8f19-4138-a51d-57a940b1e9d2" podNamespace="calico-system" podName="calico-typha-8c5777b74-fvgzh" Jan 15 13:46:33.388496 kubelet[2711]: E0115 13:46:33.387679 2711 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" containerName="calico-typha" Jan 15 13:46:33.388496 kubelet[2711]: E0115 13:46:33.387712 2711 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="9c55259c-6a2f-4fd1-8729-52141d279855" containerName="calico-kube-controllers" Jan 15 13:46:33.388496 kubelet[2711]: I0115 13:46:33.387757 2711 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c55259c-6a2f-4fd1-8729-52141d279855" containerName="calico-kube-controllers" Jan 15 13:46:33.388496 kubelet[2711]: I0115 13:46:33.387771 2711 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" containerName="calico-typha" Jan 15 13:46:33.405531 systemd[1]: Created slice kubepods-besteffort-pod4d2bce4c_8f19_4138_a51d_57a940b1e9d2.slice - libcontainer container kubepods-besteffort-pod4d2bce4c_8f19_4138_a51d_57a940b1e9d2.slice. Jan 15 13:46:33.511237 kubelet[2711]: I0115 13:46:33.511139 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4d2bce4c-8f19-4138-a51d-57a940b1e9d2-typha-certs\") pod \"calico-typha-8c5777b74-fvgzh\" (UID: \"4d2bce4c-8f19-4138-a51d-57a940b1e9d2\") " pod="calico-system/calico-typha-8c5777b74-fvgzh" Jan 15 13:46:33.511237 kubelet[2711]: I0115 13:46:33.511233 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d2bce4c-8f19-4138-a51d-57a940b1e9d2-tigera-ca-bundle\") pod \"calico-typha-8c5777b74-fvgzh\" (UID: \"4d2bce4c-8f19-4138-a51d-57a940b1e9d2\") " pod="calico-system/calico-typha-8c5777b74-fvgzh" Jan 15 13:46:33.511670 kubelet[2711]: I0115 13:46:33.511273 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxhl5\" (UniqueName: \"kubernetes.io/projected/4d2bce4c-8f19-4138-a51d-57a940b1e9d2-kube-api-access-sxhl5\") pod \"calico-typha-8c5777b74-fvgzh\" (UID: \"4d2bce4c-8f19-4138-a51d-57a940b1e9d2\") " pod="calico-system/calico-typha-8c5777b74-fvgzh" Jan 15 13:46:33.712661 containerd[1513]: time="2025-01-15T13:46:33.712430208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8c5777b74-fvgzh,Uid:4d2bce4c-8f19-4138-a51d-57a940b1e9d2,Namespace:calico-system,Attempt:0,}" Jan 15 13:46:33.774637 containerd[1513]: time="2025-01-15T13:46:33.773411791Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:46:33.774637 containerd[1513]: time="2025-01-15T13:46:33.773857680Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:46:33.774637 containerd[1513]: time="2025-01-15T13:46:33.773937131Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:33.774637 containerd[1513]: time="2025-01-15T13:46:33.774325934Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:33.826310 systemd[1]: Started cri-containerd-d34cd5a70cda19e5a90237ff3e697acb22e68bb9673c99614f27297fc0c21532.scope - libcontainer container d34cd5a70cda19e5a90237ff3e697acb22e68bb9673c99614f27297fc0c21532. Jan 15 13:46:33.899715 kubelet[2711]: I0115 13:46:33.899598 2711 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7ae6ba-25db-42e4-a2dd-054903d9d6d2" path="/var/lib/kubelet/pods/4c7ae6ba-25db-42e4-a2dd-054903d9d6d2/volumes" Jan 15 13:46:33.937314 containerd[1513]: time="2025-01-15T13:46:33.937069090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8c5777b74-fvgzh,Uid:4d2bce4c-8f19-4138-a51d-57a940b1e9d2,Namespace:calico-system,Attempt:0,} returns sandbox id \"d34cd5a70cda19e5a90237ff3e697acb22e68bb9673c99614f27297fc0c21532\"" Jan 15 13:46:33.959926 containerd[1513]: time="2025-01-15T13:46:33.959527772Z" level=info msg="CreateContainer within sandbox \"d34cd5a70cda19e5a90237ff3e697acb22e68bb9673c99614f27297fc0c21532\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 15 13:46:33.992847 containerd[1513]: time="2025-01-15T13:46:33.991242445Z" level=info msg="CreateContainer within sandbox \"d34cd5a70cda19e5a90237ff3e697acb22e68bb9673c99614f27297fc0c21532\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1fb43674ffde14431c4306c5caeafcf4155e3d57c0a1c2b0ae68139c0ea760d0\"" Jan 15 13:46:33.997519 containerd[1513]: time="2025-01-15T13:46:33.997136021Z" level=info msg="StartContainer for \"1fb43674ffde14431c4306c5caeafcf4155e3d57c0a1c2b0ae68139c0ea760d0\"" Jan 15 13:46:34.047395 systemd[1]: Started cri-containerd-1fb43674ffde14431c4306c5caeafcf4155e3d57c0a1c2b0ae68139c0ea760d0.scope - libcontainer container 1fb43674ffde14431c4306c5caeafcf4155e3d57c0a1c2b0ae68139c0ea760d0. Jan 15 13:46:34.127084 containerd[1513]: time="2025-01-15T13:46:34.126155614Z" level=info msg="StartContainer for \"1fb43674ffde14431c4306c5caeafcf4155e3d57c0a1c2b0ae68139c0ea760d0\" returns successfully" Jan 15 13:46:34.680614 kubelet[2711]: I0115 13:46:34.679362 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8c5777b74-fvgzh" podStartSLOduration=5.671408867 podStartE2EDuration="5.671408867s" podCreationTimestamp="2025-01-15 13:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:46:34.67057581 +0000 UTC m=+88.923253911" watchObservedRunningTime="2025-01-15 13:46:34.671408867 +0000 UTC m=+88.924086962" Jan 15 13:46:34.690060 systemd[1]: cri-containerd-398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6.scope: Deactivated successfully. Jan 15 13:46:34.690406 systemd[1]: cri-containerd-398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6.scope: Consumed 1.198s CPU time. Jan 15 13:46:34.730365 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6-rootfs.mount: Deactivated successfully. 
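The podStartSLOduration=5.671408867 figure above is numerically consistent with straightforward subtraction: watchObservedRunningTime (13:46:34.671408867) minus podCreationTimestamp (13:46:29), with the zeroed firstStartedPulling/lastFinishedPulling timestamps indicating no image-pull interval was excluded. A quick Go check of that arithmetic (a sanity check on the logged numbers, not the latency tracker's implementation):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matches the timestamp format kubelet prints in the log.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-01-15 13:46:29 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-01-15 13:46:34.671408867 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 5.671408867 s, matching podStartSLOduration for calico-typha.
	fmt.Printf("%.9f s\n", running.Sub(created).Seconds())
}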
Jan 15 13:46:34.740797 containerd[1513]: time="2025-01-15T13:46:34.740593652Z" level=info msg="shim disconnected" id=398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6 namespace=k8s.io Jan 15 13:46:34.740797 containerd[1513]: time="2025-01-15T13:46:34.740734149Z" level=warning msg="cleaning up after shim disconnected" id=398526cea36021dd44b12bf2fd7b74f2d844eb3f79ee5fda84386853c6617de6 namespace=k8s.io Jan 15 13:46:34.740797 containerd[1513]: time="2025-01-15T13:46:34.740755308Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 15 13:46:35.671836 containerd[1513]: time="2025-01-15T13:46:35.671743402Z" level=info msg="CreateContainer within sandbox \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 15 13:46:35.699973 containerd[1513]: time="2025-01-15T13:46:35.699895798Z" level=info msg="CreateContainer within sandbox \"6b5d022ed8710eeb1ad16698587a49112ef48687bfdf642d805f477959e9697e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3\"" Jan 15 13:46:35.701854 containerd[1513]: time="2025-01-15T13:46:35.701810632Z" level=info msg="StartContainer for \"ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3\"" Jan 15 13:46:35.754748 systemd[1]: Started cri-containerd-ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3.scope - libcontainer container ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3. Jan 15 13:46:35.806671 containerd[1513]: time="2025-01-15T13:46:35.806589725Z" level=info msg="StartContainer for \"ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3\" returns successfully" Jan 15 13:46:35.999549 kubelet[2711]: I0115 13:46:35.999467 2711 topology_manager.go:215] "Topology Admit Handler" podUID="19d678ac-39dd-40db-be3a-395302c4ac93" podNamespace="calico-system" podName="calico-kube-controllers-7fcbdb7cdc-8wf4j" Jan 15 13:46:36.013874 systemd[1]: Created slice kubepods-besteffort-pod19d678ac_39dd_40db_be3a_395302c4ac93.slice - libcontainer container kubepods-besteffort-pod19d678ac_39dd_40db_be3a_395302c4ac93.slice. 
Jan 15 13:46:36.051943 kubelet[2711]: I0115 13:46:36.051820 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk8bg\" (UniqueName: \"kubernetes.io/projected/19d678ac-39dd-40db-be3a-395302c4ac93-kube-api-access-fk8bg\") pod \"calico-kube-controllers-7fcbdb7cdc-8wf4j\" (UID: \"19d678ac-39dd-40db-be3a-395302c4ac93\") " pod="calico-system/calico-kube-controllers-7fcbdb7cdc-8wf4j" Jan 15 13:46:36.052583 kubelet[2711]: I0115 13:46:36.052541 2711 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19d678ac-39dd-40db-be3a-395302c4ac93-tigera-ca-bundle\") pod \"calico-kube-controllers-7fcbdb7cdc-8wf4j\" (UID: \"19d678ac-39dd-40db-be3a-395302c4ac93\") " pod="calico-system/calico-kube-controllers-7fcbdb7cdc-8wf4j" Jan 15 13:46:36.319329 containerd[1513]: time="2025-01-15T13:46:36.319092482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7fcbdb7cdc-8wf4j,Uid:19d678ac-39dd-40db-be3a-395302c4ac93,Namespace:calico-system,Attempt:0,}" Jan 15 13:46:36.497817 systemd-networkd[1436]: calid395778c18d: Link UP Jan 15 13:46:36.498588 systemd-networkd[1436]: calid395778c18d: Gained carrier Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.394 [INFO][6075] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0 calico-kube-controllers-7fcbdb7cdc- calico-system 19d678ac-39dd-40db-be3a-395302c4ac93 1094 0 2025-01-15 13:46:31 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7fcbdb7cdc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-6yg2e.gb1.brightbox.com calico-kube-controllers-7fcbdb7cdc-8wf4j eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid395778c18d [] []}} ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.394 [INFO][6075] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.433 [INFO][6087] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" HandleID="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.448 [INFO][6087] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" HandleID="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" 
Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000292b70), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-6yg2e.gb1.brightbox.com", "pod":"calico-kube-controllers-7fcbdb7cdc-8wf4j", "timestamp":"2025-01-15 13:46:36.43368995 +0000 UTC"}, Hostname:"srv-6yg2e.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.448 [INFO][6087] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.448 [INFO][6087] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.448 [INFO][6087] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6yg2e.gb1.brightbox.com' Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.451 [INFO][6087] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.457 [INFO][6087] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.463 [INFO][6087] ipam/ipam.go 489: Trying affinity for 192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.466 [INFO][6087] ipam/ipam.go 155: Attempting to load block cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.469 [INFO][6087] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.95.192/26 host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.469 [INFO][6087] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.95.192/26 handle="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.472 [INFO][6087] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8 Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.478 [INFO][6087] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.95.192/26 handle="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.488 [INFO][6087] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.95.199/26] block=192.168.95.192/26 handle="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.488 [INFO][6087] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.95.199/26] handle="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" host="srv-6yg2e.gb1.brightbox.com" Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.488 [INFO][6087] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 15 13:46:36.524120 containerd[1513]: 2025-01-15 13:46:36.489 [INFO][6087] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.95.199/26] IPv6=[] ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" HandleID="k8s-pod-network.dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 13:46:36.535071 containerd[1513]: 2025-01-15 13:46:36.492 [INFO][6075] cni-plugin/k8s.go 386: Populated endpoint ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0", GenerateName:"calico-kube-controllers-7fcbdb7cdc-", Namespace:"calico-system", SelfLink:"", UID:"19d678ac-39dd-40db-be3a-395302c4ac93", ResourceVersion:"1094", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 46, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7fcbdb7cdc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-7fcbdb7cdc-8wf4j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid395778c18d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:36.535071 containerd[1513]: 2025-01-15 13:46:36.492 [INFO][6075] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.95.199/32] ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 13:46:36.535071 containerd[1513]: 2025-01-15 13:46:36.492 [INFO][6075] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid395778c18d ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 13:46:36.535071 containerd[1513]: 2025-01-15 13:46:36.499 [INFO][6075] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 
13:46:36.535071 containerd[1513]: 2025-01-15 13:46:36.500 [INFO][6075] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0", GenerateName:"calico-kube-controllers-7fcbdb7cdc-", Namespace:"calico-system", SelfLink:"", UID:"19d678ac-39dd-40db-be3a-395302c4ac93", ResourceVersion:"1094", Generation:0, CreationTimestamp:time.Date(2025, time.January, 15, 13, 46, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7fcbdb7cdc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6yg2e.gb1.brightbox.com", ContainerID:"dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8", Pod:"calico-kube-controllers-7fcbdb7cdc-8wf4j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid395778c18d", MAC:"1e:75:45:91:74:a1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 15 13:46:36.535071 containerd[1513]: 2025-01-15 13:46:36.515 [INFO][6075] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8" Namespace="calico-system" Pod="calico-kube-controllers-7fcbdb7cdc-8wf4j" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--7fcbdb7cdc--8wf4j-eth0" Jan 15 13:46:36.578512 containerd[1513]: time="2025-01-15T13:46:36.574180364Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 15 13:46:36.578512 containerd[1513]: time="2025-01-15T13:46:36.574262907Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 15 13:46:36.578512 containerd[1513]: time="2025-01-15T13:46:36.574285140Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:36.578512 containerd[1513]: time="2025-01-15T13:46:36.574389599Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 15 13:46:36.622651 systemd[1]: Started cri-containerd-dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8.scope - libcontainer container dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8. 
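The ipam entries above trace Calico's block-affinity scheme: the node holds an affinity for the /26 block 192.168.95.192/26, confirms and loads it under the host-wide IPAM lock, and claims the next free address (here 192.168.95.199). A simplified Go sketch of the assignment step (illustrative only; real Calico IPAM also handles block claiming, reservations, and the datastore write, none of which is modeled here):

package main

import (
	"fmt"
	"net/netip"
)

// assignFromBlock sketches the step logged above: scan the affine block
// for the first unallocated address and claim it.
func assignFromBlock(block netip.Prefix, allocated map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !allocated[a] {
			allocated[a] = true
			return a, true
		}
	}
	return netip.Addr{}, false // exhausted: a real node would claim a new block
}

func main() {
	block := netip.MustParsePrefix("192.168.95.192/26")
	allocated := map[netip.Addr]bool{}
	// Suppose .192 through .198 were handed to earlier pods on this node.
	for a, n := block.Addr(), 0; n < 7; a, n = a.Next(), n+1 {
		allocated[a] = true
	}
	ip, ok := assignFromBlock(block, allocated)
	fmt.Println(ip, ok) // 192.168.95.199 true, matching the log
}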
Jan 15 13:46:36.766831 containerd[1513]: time="2025-01-15T13:46:36.766190793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7fcbdb7cdc-8wf4j,Uid:19d678ac-39dd-40db-be3a-395302c4ac93,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8\"" Jan 15 13:46:36.835191 containerd[1513]: time="2025-01-15T13:46:36.835054094Z" level=info msg="CreateContainer within sandbox \"dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 15 13:46:36.863522 containerd[1513]: time="2025-01-15T13:46:36.863287484Z" level=info msg="CreateContainer within sandbox \"dd64aa54a6a527667c46817b1345b4099517434e7ae3a9b9b6d2383f4f628ed8\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a6d953fd502ffb5c7d5ed35b5d50326677bf83ed80f9d6bd6622eda5997981a3\"" Jan 15 13:46:36.864540 containerd[1513]: time="2025-01-15T13:46:36.864492417Z" level=info msg="StartContainer for \"a6d953fd502ffb5c7d5ed35b5d50326677bf83ed80f9d6bd6622eda5997981a3\"" Jan 15 13:46:36.904692 systemd[1]: Started cri-containerd-a6d953fd502ffb5c7d5ed35b5d50326677bf83ed80f9d6bd6622eda5997981a3.scope - libcontainer container a6d953fd502ffb5c7d5ed35b5d50326677bf83ed80f9d6bd6622eda5997981a3. Jan 15 13:46:36.970197 containerd[1513]: time="2025-01-15T13:46:36.969984969Z" level=info msg="StartContainer for \"a6d953fd502ffb5c7d5ed35b5d50326677bf83ed80f9d6bd6622eda5997981a3\" returns successfully" Jan 15 13:46:37.853954 systemd[1]: run-containerd-runc-k8s.io-ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3-runc.4nWsMl.mount: Deactivated successfully. Jan 15 13:46:37.938709 kubelet[2711]: I0115 13:46:37.938062 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7fcbdb7cdc-8wf4j" podStartSLOduration=6.938008627 podStartE2EDuration="6.938008627s" podCreationTimestamp="2025-01-15 13:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:46:37.92685092 +0000 UTC m=+92.179529015" watchObservedRunningTime="2025-01-15 13:46:37.938008627 +0000 UTC m=+92.190686720" Jan 15 13:46:37.940773 kubelet[2711]: I0115 13:46:37.939765 2711 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-np8zb" podStartSLOduration=7.939754604 podStartE2EDuration="7.939754604s" podCreationTimestamp="2025-01-15 13:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-15 13:46:36.692927776 +0000 UTC m=+90.945605867" watchObservedRunningTime="2025-01-15 13:46:37.939754604 +0000 UTC m=+92.192432690" Jan 15 13:46:38.202629 systemd-networkd[1436]: calid395778c18d: Gained IPv6LL Jan 15 13:46:38.778206 systemd[1]: run-containerd-runc-k8s.io-a6d953fd502ffb5c7d5ed35b5d50326677bf83ed80f9d6bd6622eda5997981a3-runc.Yvbnhk.mount: Deactivated successfully. Jan 15 13:47:01.279066 systemd[1]: run-containerd-runc-k8s.io-ac6c834c8b82e2ddbf0905e4f433cde72cac098f02aeae232fc41baabfaf00e3-runc.vM1lER.mount: Deactivated successfully. 
Jan 15 13:47:08.114909 containerd[1513]: time="2025-01-15T13:47:08.106781620Z" level=info msg="StopPodSandbox for \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\"" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.216 [WARNING][7075] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.217 [INFO][7075] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.217 [INFO][7075] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" iface="eth0" netns="" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.217 [INFO][7075] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.217 [INFO][7075] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.335 [INFO][7081] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.337 [INFO][7081] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.337 [INFO][7081] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.348 [WARNING][7081] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.348 [INFO][7081] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.350 [INFO][7081] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:47:08.355131 containerd[1513]: 2025-01-15 13:47:08.353 [INFO][7075] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.369126 containerd[1513]: time="2025-01-15T13:47:08.368491606Z" level=info msg="TearDown network for sandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" successfully" Jan 15 13:47:08.369836 containerd[1513]: time="2025-01-15T13:47:08.369317816Z" level=info msg="StopPodSandbox for \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" returns successfully" Jan 15 13:47:08.377324 containerd[1513]: time="2025-01-15T13:47:08.377263761Z" level=info msg="RemovePodSandbox for \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\"" Jan 15 13:47:08.386027 containerd[1513]: time="2025-01-15T13:47:08.385958282Z" level=info msg="Forcibly stopping sandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\"" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.439 [WARNING][7099] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" WorkloadEndpoint="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.439 [INFO][7099] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.439 [INFO][7099] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" iface="eth0" netns="" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.439 [INFO][7099] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.439 [INFO][7099] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.467 [INFO][7105] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.467 [INFO][7105] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.468 [INFO][7105] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.478 [WARNING][7105] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.478 [INFO][7105] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" HandleID="k8s-pod-network.cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Workload="srv--6yg2e.gb1.brightbox.com-k8s-calico--kube--controllers--8496b9bc76--lh2fq-eth0" Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.480 [INFO][7105] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 15 13:47:08.484627 containerd[1513]: 2025-01-15 13:47:08.482 [INFO][7099] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6" Jan 15 13:47:08.486281 containerd[1513]: time="2025-01-15T13:47:08.484686926Z" level=info msg="TearDown network for sandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" successfully" Jan 15 13:47:08.505398 containerd[1513]: time="2025-01-15T13:47:08.505270086Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:47:08.505613 containerd[1513]: time="2025-01-15T13:47:08.505429904Z" level=info msg="RemovePodSandbox \"cb59a460f869564ab60fef6836d39bb850b1ab1be941911cde4b01b317e17ce6\" returns successfully" Jan 15 13:47:08.506233 containerd[1513]: time="2025-01-15T13:47:08.506184746Z" level=info msg="StopPodSandbox for \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\"" Jan 15 13:47:08.506339 containerd[1513]: time="2025-01-15T13:47:08.506322728Z" level=info msg="TearDown network for sandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" successfully" Jan 15 13:47:08.506402 containerd[1513]: time="2025-01-15T13:47:08.506345143Z" level=info msg="StopPodSandbox for \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" returns successfully" Jan 15 13:47:08.506935 containerd[1513]: time="2025-01-15T13:47:08.506899576Z" level=info msg="RemovePodSandbox for \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\"" Jan 15 13:47:08.507051 containerd[1513]: time="2025-01-15T13:47:08.506936670Z" level=info msg="Forcibly stopping sandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\"" Jan 15 13:47:08.507051 containerd[1513]: time="2025-01-15T13:47:08.507002717Z" level=info msg="TearDown network for sandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" successfully" Jan 15 13:47:08.512710 containerd[1513]: time="2025-01-15T13:47:08.512630792Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
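The "Forcibly stopping sandbox" sequences above show the same tolerance at the sandbox level: network teardown, IP release, and status lookups all treat already-deleted state ("Asked to release address but it doesn't exist. Ignoring", "not found ... Sending the event with nil podSandboxStatus") as ignorable, so a garbage-collection retry never fails on work a previous pass completed. A compact sketch of that step-pipeline shape (hypothetical helper, not containerd's implementation):

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("not found")

// forceRemove sketches the forcible-removal pipeline above: each teardown
// step tolerates already-deleted state so retries stay idempotent.
func forceRemove(id string, steps ...func(string) error) error {
	for _, step := range steps {
		if err := step(id); err != nil && !errors.Is(err, errNotFound) {
			return err
		}
	}
	return nil
}

func main() {
	err := forceRemove("cb59a460f869...",
		func(id string) error { fmt.Println("TearDown network for", id); return nil },
		func(id string) error { return errNotFound }, // release IP: already gone, ignored
		func(id string) error { fmt.Println("RemovePodSandbox", id, "returns successfully"); return nil },
	)
	fmt.Println(err) // <nil>
}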
Jan 15 13:47:08.512825 containerd[1513]: time="2025-01-15T13:47:08.512798117Z" level=info msg="RemovePodSandbox \"086ac8a46b6a4e54eef28d01f0690077bd482ff263996f21cf4efe95ff410c99\" returns successfully" Jan 15 13:47:08.513478 containerd[1513]: time="2025-01-15T13:47:08.513376308Z" level=info msg="StopPodSandbox for \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\"" Jan 15 13:47:08.513578 containerd[1513]: time="2025-01-15T13:47:08.513549200Z" level=info msg="TearDown network for sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" successfully" Jan 15 13:47:08.513578 containerd[1513]: time="2025-01-15T13:47:08.513570813Z" level=info msg="StopPodSandbox for \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" returns successfully" Jan 15 13:47:08.514261 containerd[1513]: time="2025-01-15T13:47:08.513950305Z" level=info msg="RemovePodSandbox for \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\"" Jan 15 13:47:08.514261 containerd[1513]: time="2025-01-15T13:47:08.514027483Z" level=info msg="Forcibly stopping sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\"" Jan 15 13:47:08.514261 containerd[1513]: time="2025-01-15T13:47:08.514098115Z" level=info msg="TearDown network for sandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" successfully" Jan 15 13:47:08.518073 containerd[1513]: time="2025-01-15T13:47:08.518015549Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 15 13:47:08.518223 containerd[1513]: time="2025-01-15T13:47:08.518083861Z" level=info msg="RemovePodSandbox \"f89e45dc25050936e23406dd32dc2465ae47034ff1ead3659e833946c3d08f19\" returns successfully" Jan 15 13:47:25.602048 systemd[1]: Started sshd@7-10.230.66.218:22-147.75.109.163:59130.service - OpenSSH per-connection server daemon (147.75.109.163:59130). Jan 15 13:47:26.564510 sshd[7142]: Accepted publickey for core from 147.75.109.163 port 59130 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:47:26.568528 sshd[7142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:47:26.579560 systemd-logind[1493]: New session 10 of user core. Jan 15 13:47:26.582714 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 15 13:47:27.742319 sshd[7142]: pam_unix(sshd:session): session closed for user core Jan 15 13:47:27.749654 systemd[1]: sshd@7-10.230.66.218:22-147.75.109.163:59130.service: Deactivated successfully. Jan 15 13:47:27.752471 systemd[1]: session-10.scope: Deactivated successfully. Jan 15 13:47:27.753681 systemd-logind[1493]: Session 10 logged out. Waiting for processes to exit. Jan 15 13:47:27.756520 systemd-logind[1493]: Removed session 10. Jan 15 13:47:32.907605 systemd[1]: Started sshd@8-10.230.66.218:22-147.75.109.163:49392.service - OpenSSH per-connection server daemon (147.75.109.163:49392). Jan 15 13:47:33.846670 sshd[7178]: Accepted publickey for core from 147.75.109.163 port 49392 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:47:33.849744 sshd[7178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:47:33.860485 systemd-logind[1493]: New session 11 of user core. Jan 15 13:47:33.865657 systemd[1]: Started session-11.scope - Session 11 of User core. 
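The "SHA256:yhnrVaQ6..." value in each Accepted-publickey line is the standard OpenSSH fingerprint format: the SHA-256 digest of the key's wire-format blob, base64-encoded without padding. A short Go sketch of that computation (the key bytes below are a hypothetical stand-in; a real blob is the decoded base64 field of a public-key line):

package main

import (
	"crypto/sha256"
	"encoding/base64"
	"fmt"
)

// fingerprint renders an OpenSSH-style public-key fingerprint like the
// "SHA256:..." values above: sha256 of the wire-format key blob,
// base64 without padding.
func fingerprint(keyBlob []byte) string {
	sum := sha256.Sum256(keyBlob)
	return "SHA256:" + base64.RawStdEncoding.EncodeToString(sum[:])
}

func main() {
	// Hypothetical stand-in bytes; substitute the decoded key blob from
	// an authorized_keys or id_rsa.pub entry to reproduce a real value.
	fmt.Println(fingerprint([]byte("example wire-format key")))
}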
Jan 15 13:47:34.626524 sshd[7178]: pam_unix(sshd:session): session closed for user core Jan 15 13:47:34.632256 systemd[1]: sshd@8-10.230.66.218:22-147.75.109.163:49392.service: Deactivated successfully. Jan 15 13:47:34.636722 systemd[1]: session-11.scope: Deactivated successfully. Jan 15 13:47:34.639653 systemd-logind[1493]: Session 11 logged out. Waiting for processes to exit. Jan 15 13:47:34.641852 systemd-logind[1493]: Removed session 11. Jan 15 13:47:39.784881 systemd[1]: Started sshd@9-10.230.66.218:22-147.75.109.163:35416.service - OpenSSH per-connection server daemon (147.75.109.163:35416). Jan 15 13:47:40.712975 sshd[7230]: Accepted publickey for core from 147.75.109.163 port 35416 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:47:40.715491 sshd[7230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:47:40.725354 systemd-logind[1493]: New session 12 of user core. Jan 15 13:47:40.729684 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 15 13:47:41.427357 sshd[7230]: pam_unix(sshd:session): session closed for user core Jan 15 13:47:41.433166 systemd[1]: sshd@9-10.230.66.218:22-147.75.109.163:35416.service: Deactivated successfully. Jan 15 13:47:41.436608 systemd[1]: session-12.scope: Deactivated successfully. Jan 15 13:47:41.438559 systemd-logind[1493]: Session 12 logged out. Waiting for processes to exit. Jan 15 13:47:41.440186 systemd-logind[1493]: Removed session 12. Jan 15 13:47:41.585872 systemd[1]: Started sshd@10-10.230.66.218:22-147.75.109.163:35430.service - OpenSSH per-connection server daemon (147.75.109.163:35430). Jan 15 13:47:42.477211 sshd[7243]: Accepted publickey for core from 147.75.109.163 port 35430 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:47:42.480379 sshd[7243]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:47:42.487977 systemd-logind[1493]: New session 13 of user core. Jan 15 13:47:42.494719 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 15 13:47:43.272932 sshd[7243]: pam_unix(sshd:session): session closed for user core Jan 15 13:47:43.277158 systemd[1]: sshd@10-10.230.66.218:22-147.75.109.163:35430.service: Deactivated successfully. Jan 15 13:47:43.279876 systemd[1]: session-13.scope: Deactivated successfully. Jan 15 13:47:43.281825 systemd-logind[1493]: Session 13 logged out. Waiting for processes to exit. Jan 15 13:47:43.284222 systemd-logind[1493]: Removed session 13. Jan 15 13:47:43.435807 systemd[1]: Started sshd@11-10.230.66.218:22-147.75.109.163:35444.service - OpenSSH per-connection server daemon (147.75.109.163:35444). Jan 15 13:47:44.352047 sshd[7254]: Accepted publickey for core from 147.75.109.163 port 35444 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA Jan 15 13:47:44.354534 sshd[7254]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 13:47:44.360541 systemd-logind[1493]: New session 14 of user core. Jan 15 13:47:44.368706 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 15 13:47:45.063823 sshd[7254]: pam_unix(sshd:session): session closed for user core Jan 15 13:47:45.069485 systemd-logind[1493]: Session 14 logged out. Waiting for processes to exit. Jan 15 13:47:45.071555 systemd[1]: sshd@11-10.230.66.218:22-147.75.109.163:35444.service: Deactivated successfully. Jan 15 13:47:45.075111 systemd[1]: session-14.scope: Deactivated successfully. Jan 15 13:47:45.076764 systemd-logind[1493]: Removed session 14. 
Jan 15 13:47:50.223012 systemd[1]: Started sshd@12-10.230.66.218:22-147.75.109.163:42414.service - OpenSSH per-connection server daemon (147.75.109.163:42414).
Jan 15 13:47:51.108340 sshd[7280]: Accepted publickey for core from 147.75.109.163 port 42414 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:47:51.110611 sshd[7280]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:47:51.118293 systemd-logind[1493]: New session 15 of user core.
Jan 15 13:47:51.130735 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 15 13:47:51.828144 sshd[7280]: pam_unix(sshd:session): session closed for user core
Jan 15 13:47:51.840735 systemd[1]: sshd@12-10.230.66.218:22-147.75.109.163:42414.service: Deactivated successfully.
Jan 15 13:47:51.844037 systemd[1]: session-15.scope: Deactivated successfully.
Jan 15 13:47:51.845533 systemd-logind[1493]: Session 15 logged out. Waiting for processes to exit.
Jan 15 13:47:51.847427 systemd-logind[1493]: Removed session 15.
Jan 15 13:47:51.985360 systemd[1]: Started sshd@13-10.230.66.218:22-147.75.109.163:42430.service - OpenSSH per-connection server daemon (147.75.109.163:42430).
Jan 15 13:47:52.903888 sshd[7295]: Accepted publickey for core from 147.75.109.163 port 42430 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:47:52.906582 sshd[7295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:47:52.913618 systemd-logind[1493]: New session 16 of user core.
Jan 15 13:47:52.919796 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 15 13:47:53.980349 sshd[7295]: pam_unix(sshd:session): session closed for user core
Jan 15 13:47:53.988427 systemd[1]: sshd@13-10.230.66.218:22-147.75.109.163:42430.service: Deactivated successfully.
Jan 15 13:47:53.991658 systemd[1]: session-16.scope: Deactivated successfully.
Jan 15 13:47:53.993356 systemd-logind[1493]: Session 16 logged out. Waiting for processes to exit.
Jan 15 13:47:53.995620 systemd-logind[1493]: Removed session 16.
Jan 15 13:47:54.137948 systemd[1]: Started sshd@14-10.230.66.218:22-147.75.109.163:42438.service - OpenSSH per-connection server daemon (147.75.109.163:42438).
Jan 15 13:47:55.042430 sshd[7306]: Accepted publickey for core from 147.75.109.163 port 42438 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:47:55.045631 sshd[7306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:47:55.055506 systemd-logind[1493]: New session 17 of user core.
Jan 15 13:47:55.060642 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 15 13:47:58.924095 sshd[7306]: pam_unix(sshd:session): session closed for user core
Jan 15 13:47:58.938832 systemd[1]: sshd@14-10.230.66.218:22-147.75.109.163:42438.service: Deactivated successfully.
Jan 15 13:47:58.942482 systemd[1]: session-17.scope: Deactivated successfully.
Jan 15 13:47:58.945021 systemd-logind[1493]: Session 17 logged out. Waiting for processes to exit.
Jan 15 13:47:58.947750 systemd-logind[1493]: Removed session 17.
Jan 15 13:47:59.065405 systemd[1]: Started sshd@15-10.230.66.218:22-147.75.109.163:38608.service - OpenSSH per-connection server daemon (147.75.109.163:38608).
Jan 15 13:48:00.030469 sshd[7324]: Accepted publickey for core from 147.75.109.163 port 38608 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:48:00.047960 sshd[7324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:48:00.058649 systemd-logind[1493]: New session 18 of user core.
Jan 15 13:48:00.065951 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 15 13:48:01.345275 sshd[7324]: pam_unix(sshd:session): session closed for user core
Jan 15 13:48:01.354933 systemd[1]: sshd@15-10.230.66.218:22-147.75.109.163:38608.service: Deactivated successfully.
Jan 15 13:48:01.359865 systemd[1]: session-18.scope: Deactivated successfully.
Jan 15 13:48:01.363907 systemd-logind[1493]: Session 18 logged out. Waiting for processes to exit.
Jan 15 13:48:01.366842 systemd-logind[1493]: Removed session 18.
Jan 15 13:48:01.497412 systemd[1]: Started sshd@16-10.230.66.218:22-147.75.109.163:38620.service - OpenSSH per-connection server daemon (147.75.109.163:38620).
Jan 15 13:48:02.423611 sshd[7349]: Accepted publickey for core from 147.75.109.163 port 38620 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:48:02.427362 sshd[7349]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:48:02.436076 systemd-logind[1493]: New session 19 of user core.
Jan 15 13:48:02.441916 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 15 13:48:03.311840 sshd[7349]: pam_unix(sshd:session): session closed for user core
Jan 15 13:48:03.318099 systemd-logind[1493]: Session 19 logged out. Waiting for processes to exit.
Jan 15 13:48:03.319207 systemd[1]: sshd@16-10.230.66.218:22-147.75.109.163:38620.service: Deactivated successfully.
Jan 15 13:48:03.323061 systemd[1]: session-19.scope: Deactivated successfully.
Jan 15 13:48:03.327053 systemd-logind[1493]: Removed session 19.
Jan 15 13:48:08.474819 systemd[1]: Started sshd@17-10.230.66.218:22-147.75.109.163:33342.service - OpenSSH per-connection server daemon (147.75.109.163:33342).
Jan 15 13:48:09.401860 sshd[7392]: Accepted publickey for core from 147.75.109.163 port 33342 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:48:09.404733 sshd[7392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:48:09.413011 systemd-logind[1493]: New session 20 of user core.
Jan 15 13:48:09.417648 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 15 13:48:10.193808 sshd[7392]: pam_unix(sshd:session): session closed for user core
Jan 15 13:48:10.199965 systemd[1]: sshd@17-10.230.66.218:22-147.75.109.163:33342.service: Deactivated successfully.
Jan 15 13:48:10.202414 systemd[1]: session-20.scope: Deactivated successfully.
Jan 15 13:48:10.204089 systemd-logind[1493]: Session 20 logged out. Waiting for processes to exit.
Jan 15 13:48:10.205694 systemd-logind[1493]: Removed session 20.
Jan 15 13:48:15.353776 systemd[1]: Started sshd@18-10.230.66.218:22-147.75.109.163:33346.service - OpenSSH per-connection server daemon (147.75.109.163:33346).
Jan 15 13:48:16.278412 sshd[7405]: Accepted publickey for core from 147.75.109.163 port 33346 ssh2: RSA SHA256:yhnrVaQ6ubHMaiRHrttc+bh72AQMS/h1RjuSsQ1sZRA
Jan 15 13:48:16.281175 sshd[7405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 15 13:48:16.288979 systemd-logind[1493]: New session 21 of user core.
Jan 15 13:48:16.295733 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 15 13:48:16.997715 sshd[7405]: pam_unix(sshd:session): session closed for user core
Jan 15 13:48:17.003851 systemd[1]: sshd@18-10.230.66.218:22-147.75.109.163:33346.service: Deactivated successfully.
Jan 15 13:48:17.007214 systemd[1]: session-21.scope: Deactivated successfully.
Jan 15 13:48:17.008177 systemd-logind[1493]: Session 21 logged out. Waiting for processes to exit.
Jan 15 13:48:17.009902 systemd-logind[1493]: Removed session 21.