Dec 13 13:29:19.885278 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 13 13:29:19.885324 kernel: Linux version 6.6.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Fri Dec 13 11:56:07 -00 2024
Dec 13 13:29:19.885339 kernel: KASLR enabled
Dec 13 13:29:19.885346 kernel: efi: EFI v2.7 by EDK II
Dec 13 13:29:19.885352 kernel: efi: SMBIOS 3.0=0x135ed0000 MEMATTR=0x133c6b018 ACPI 2.0=0x132430018 RNG=0x13243e918 MEMRESERVE=0x132357218
Dec 13 13:29:19.885357 kernel: random: crng init done
Dec 13 13:29:19.885364 kernel: secureboot: Secure boot disabled
Dec 13 13:29:19.885370 kernel: ACPI: Early table checksum verification disabled
Dec 13 13:29:19.885376 kernel: ACPI: RSDP 0x0000000132430018 000024 (v02 BOCHS )
Dec 13 13:29:19.885382 kernel: ACPI: XSDT 0x000000013243FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Dec 13 13:29:19.885390 kernel: ACPI: FACP 0x000000013243FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885396 kernel: ACPI: DSDT 0x0000000132437518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885402 kernel: ACPI: APIC 0x000000013243FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885408 kernel: ACPI: PPTT 0x000000013243FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885415 kernel: ACPI: GTDT 0x000000013243D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885423 kernel: ACPI: MCFG 0x000000013243FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885430 kernel: ACPI: SPCR 0x000000013243E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885436 kernel: ACPI: DBG2 0x000000013243E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885442 kernel: ACPI: IORT 0x000000013243E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 13 13:29:19.885448 kernel: ACPI: BGRT 0x000000013243E798 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 13 13:29:19.885454 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 13 13:29:19.885460 kernel: NUMA: Failed to initialise from firmware
Dec 13 13:29:19.885467 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Dec 13 13:29:19.885472 kernel: NUMA: NODE_DATA [mem 0x13981f800-0x139824fff]
Dec 13 13:29:19.885478 kernel: Zone ranges:
Dec 13 13:29:19.885484 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 13 13:29:19.885492 kernel: DMA32 empty
Dec 13 13:29:19.885514 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Dec 13 13:29:19.885520 kernel: Movable zone start for each node
Dec 13 13:29:19.885526 kernel: Early memory node ranges
Dec 13 13:29:19.885532 kernel: node 0: [mem 0x0000000040000000-0x000000013243ffff]
Dec 13 13:29:19.885575 kernel: node 0: [mem 0x0000000132440000-0x000000013272ffff]
Dec 13 13:29:19.885583 kernel: node 0: [mem 0x0000000132730000-0x0000000135bfffff]
Dec 13 13:29:19.885589 kernel: node 0: [mem 0x0000000135c00000-0x0000000135fdffff]
Dec 13 13:29:19.885595 kernel: node 0: [mem 0x0000000135fe0000-0x0000000139ffffff]
Dec 13 13:29:19.885601 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Dec 13 13:29:19.885608 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Dec 13 13:29:19.885619 kernel: psci: probing for conduit method from ACPI.
Dec 13 13:29:19.885625 kernel: psci: PSCIv1.1 detected in firmware.
Dec 13 13:29:19.885631 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 13 13:29:19.885640 kernel: psci: Trusted OS migration not required
Dec 13 13:29:19.885646 kernel: psci: SMC Calling Convention v1.1
Dec 13 13:29:19.885653 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 13 13:29:19.885661 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Dec 13 13:29:19.885668 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Dec 13 13:29:19.885675 kernel: pcpu-alloc: [0] 0 [0] 1
Dec 13 13:29:19.885681 kernel: Detected PIPT I-cache on CPU0
Dec 13 13:29:19.885687 kernel: CPU features: detected: GIC system register CPU interface
Dec 13 13:29:19.885694 kernel: CPU features: detected: Hardware dirty bit management
Dec 13 13:29:19.885700 kernel: CPU features: detected: Spectre-v4
Dec 13 13:29:19.885707 kernel: CPU features: detected: Spectre-BHB
Dec 13 13:29:19.885713 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 13 13:29:19.885719 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 13 13:29:19.885726 kernel: CPU features: detected: ARM erratum 1418040
Dec 13 13:29:19.885734 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 13 13:29:19.885740 kernel: alternatives: applying boot alternatives
Dec 13 13:29:19.885776 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=c48af8adabdaf1d8e07ceb011d2665929c607ddf2c4d40203b31334d745cc472
Dec 13 13:29:19.885788 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Dec 13 13:29:19.885795 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 13 13:29:19.885801 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 13 13:29:19.885808 kernel: Fallback order for Node 0: 0
Dec 13 13:29:19.885814 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Dec 13 13:29:19.885820 kernel: Policy zone: Normal
Dec 13 13:29:19.885827 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 13 13:29:19.885833 kernel: software IO TLB: area num 2.
Dec 13 13:29:19.885843 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Dec 13 13:29:19.885850 kernel: Memory: 3881016K/4096000K available (10304K kernel code, 2184K rwdata, 8088K rodata, 39936K init, 897K bss, 214984K reserved, 0K cma-reserved)
Dec 13 13:29:19.885857 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 13 13:29:19.885863 kernel: trace event string verifier disabled
Dec 13 13:29:19.885869 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 13 13:29:19.885877 kernel: rcu: RCU event tracing is enabled.
Dec 13 13:29:19.885883 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 13 13:29:19.885890 kernel: Trampoline variant of Tasks RCU enabled.
Dec 13 13:29:19.885896 kernel: Tracing variant of Tasks RCU enabled.
Dec 13 13:29:19.885903 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 13 13:29:19.885909 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 13 13:29:19.885918 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 13 13:29:19.885924 kernel: GICv3: 256 SPIs implemented
Dec 13 13:29:19.885930 kernel: GICv3: 0 Extended SPIs implemented
Dec 13 13:29:19.885937 kernel: Root IRQ handler: gic_handle_irq
Dec 13 13:29:19.885943 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 13 13:29:19.885950 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 13 13:29:19.885956 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 13 13:29:19.885963 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Dec 13 13:29:19.886003 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Dec 13 13:29:19.886012 kernel: GICv3: using LPI property table @0x00000001000e0000
Dec 13 13:29:19.886019 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Dec 13 13:29:19.886025 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 13 13:29:19.886046 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 13 13:29:19.886053 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 13 13:29:19.886060 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 13 13:29:19.886067 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 13 13:29:19.886074 kernel: Console: colour dummy device 80x25
Dec 13 13:29:19.886080 kernel: ACPI: Core revision 20230628
Dec 13 13:29:19.886087 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 13 13:29:19.886094 kernel: pid_max: default: 32768 minimum: 301
Dec 13 13:29:19.886101 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Dec 13 13:29:19.886108 kernel: landlock: Up and running.
Dec 13 13:29:19.886116 kernel: SELinux: Initializing.
Dec 13 13:29:19.886123 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 13 13:29:19.886130 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 13 13:29:19.886137 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 13 13:29:19.886144 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 13 13:29:19.886150 kernel: rcu: Hierarchical SRCU implementation.
Dec 13 13:29:19.886157 kernel: rcu: Max phase no-delay instances is 400.
Dec 13 13:29:19.886164 kernel: Platform MSI: ITS@0x8080000 domain created
Dec 13 13:29:19.886170 kernel: PCI/MSI: ITS@0x8080000 domain created
Dec 13 13:29:19.886179 kernel: Remapping and enabling EFI services.
Dec 13 13:29:19.886186 kernel: smp: Bringing up secondary CPUs ...
Dec 13 13:29:19.886222 kernel: Detected PIPT I-cache on CPU1
Dec 13 13:29:19.886234 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 13 13:29:19.886241 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Dec 13 13:29:19.886247 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 13 13:29:19.886254 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 13 13:29:19.886261 kernel: smp: Brought up 1 node, 2 CPUs
Dec 13 13:29:19.886268 kernel: SMP: Total of 2 processors activated.
Dec 13 13:29:19.886278 kernel: CPU features: detected: 32-bit EL0 Support
Dec 13 13:29:19.886285 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 13 13:29:19.886298 kernel: CPU features: detected: Common not Private translations
Dec 13 13:29:19.886307 kernel: CPU features: detected: CRC32 instructions
Dec 13 13:29:19.886314 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 13 13:29:19.886321 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 13 13:29:19.886328 kernel: CPU features: detected: LSE atomic instructions
Dec 13 13:29:19.886335 kernel: CPU features: detected: Privileged Access Never
Dec 13 13:29:19.886342 kernel: CPU features: detected: RAS Extension Support
Dec 13 13:29:19.886351 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 13 13:29:19.886358 kernel: CPU: All CPU(s) started at EL1
Dec 13 13:29:19.886365 kernel: alternatives: applying system-wide alternatives
Dec 13 13:29:19.886372 kernel: devtmpfs: initialized
Dec 13 13:29:19.886379 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 13 13:29:19.886386 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 13 13:29:19.886393 kernel: pinctrl core: initialized pinctrl subsystem
Dec 13 13:29:19.886400 kernel: SMBIOS 3.0.0 present.
Dec 13 13:29:19.886409 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Dec 13 13:29:19.886416 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 13 13:29:19.886460 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Dec 13 13:29:19.886468 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 13 13:29:19.886475 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 13 13:29:19.886482 kernel: audit: initializing netlink subsys (disabled)
Dec 13 13:29:19.886489 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Dec 13 13:29:19.886509 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 13 13:29:19.886517 kernel: cpuidle: using governor menu
Dec 13 13:29:19.886528 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 13 13:29:19.886535 kernel: ASID allocator initialised with 32768 entries
Dec 13 13:29:19.886542 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 13 13:29:19.886549 kernel: Serial: AMBA PL011 UART driver
Dec 13 13:29:19.886556 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 13 13:29:19.886563 kernel: Modules: 0 pages in range for non-PLT usage
Dec 13 13:29:19.886570 kernel: Modules: 508880 pages in range for PLT usage
Dec 13 13:29:19.886578 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 13 13:29:19.886585 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 13 13:29:19.886593 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 13 13:29:19.886601 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 13 13:29:19.886608 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 13 13:29:19.886615 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 13 13:29:19.886622 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 13 13:29:19.886629 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 13 13:29:19.886636 kernel: ACPI: Added _OSI(Module Device)
Dec 13 13:29:19.886643 kernel: ACPI: Added _OSI(Processor Device)
Dec 13 13:29:19.886683 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 13 13:29:19.886694 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 13 13:29:19.886701 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 13 13:29:19.886708 kernel: ACPI: Interpreter enabled
Dec 13 13:29:19.886715 kernel: ACPI: Using GIC for interrupt routing
Dec 13 13:29:19.886722 kernel: ACPI: MCFG table detected, 1 entries
Dec 13 13:29:19.886729 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 13 13:29:19.886736 kernel: printk: console [ttyAMA0] enabled
Dec 13 13:29:19.886743 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 13 13:29:19.887114 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 13 13:29:19.887218 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 13 13:29:19.887286 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 13 13:29:19.887353 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 13 13:29:19.887429 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 13 13:29:19.887440 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 13 13:29:19.887448 kernel: PCI host bridge to bus 0000:00
Dec 13 13:29:19.887618 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 13 13:29:19.887709 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 13 13:29:19.887771 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 13 13:29:19.887873 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 13 13:29:19.887987 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Dec 13 13:29:19.888103 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Dec 13 13:29:19.888179 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Dec 13 13:29:19.888253 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Dec 13 13:29:19.888348 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.888424 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Dec 13 13:29:19.888528 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.888603 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Dec 13 13:29:19.888680 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.888754 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Dec 13 13:29:19.888831 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.888901 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Dec 13 13:29:19.888975 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.889064 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Dec 13 13:29:19.889142 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.889223 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Dec 13 13:29:19.889304 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.889376 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Dec 13 13:29:19.889561 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.889710 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Dec 13 13:29:19.889796 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Dec 13 13:29:19.889938 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Dec 13 13:29:19.890123 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Dec 13 13:29:19.890228 kernel: pci 0000:00:04.0: reg 0x10: [io 0x8200-0x8207]
Dec 13 13:29:19.890377 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Dec 13 13:29:19.890460 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Dec 13 13:29:19.890682 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 13 13:29:19.890863 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Dec 13 13:29:19.890986 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Dec 13 13:29:19.891144 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Dec 13 13:29:19.891320 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Dec 13 13:29:19.891410 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Dec 13 13:29:19.891627 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 13 13:29:19.891821 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Dec 13 13:29:19.891903 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 13 13:29:19.891985 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Dec 13 13:29:19.892135 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 13 13:29:19.892285 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Dec 13 13:29:19.892377 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Dec 13 13:29:19.892450 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 13 13:29:19.892680 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Dec 13 13:29:19.892840 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Dec 13 13:29:19.892916 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Dec 13 13:29:19.893069 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Dec 13 13:29:19.893152 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 13 13:29:19.893280 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 13 13:29:19.893359 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 13 13:29:19.893438 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 13 13:29:19.893579 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 13 13:29:19.893656 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 13 13:29:19.893871 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 13 13:29:19.893999 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 13 13:29:19.894094 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 13 13:29:19.894243 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 13 13:29:19.894330 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 13 13:29:19.894399 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 13 13:29:19.894474 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 13 13:29:19.895327 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 13 13:29:19.895413 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Dec 13 13:29:19.895489 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 13 13:29:19.895586 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 13 13:29:19.895662 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 13 13:29:19.895762 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 13 13:29:19.895842 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Dec 13 13:29:19.895966 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Dec 13 13:29:19.896099 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 13 13:29:19.896175 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 13 13:29:19.896243 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 13 13:29:19.896315 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 13 13:29:19.896388 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 13 13:29:19.896454 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 13 13:29:19.896591 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Dec 13 13:29:19.896665 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 13 13:29:19.896736 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Dec 13 13:29:19.896822 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 13 13:29:19.896894 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Dec 13 13:29:19.896986 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 13 13:29:19.897076 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Dec 13 13:29:19.897155 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 13 13:29:19.897226 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Dec 13 13:29:19.897295 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 13 13:29:19.897365 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Dec 13 13:29:19.897432 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 13 13:29:19.897617 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Dec 13 13:29:19.897693 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 13 13:29:19.897761 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Dec 13 13:29:19.897827 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 13 13:29:19.897965 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Dec 13 13:29:19.898098 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Dec 13 13:29:19.898201 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Dec 13 13:29:19.898284 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Dec 13 13:29:19.898355 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Dec 13 13:29:19.898422 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Dec 13 13:29:19.898503 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Dec 13 13:29:19.900656 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Dec 13 13:29:19.900783 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Dec 13 13:29:19.900866 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Dec 13 13:29:19.900940 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Dec 13 13:29:19.901019 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Dec 13 13:29:19.901117 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Dec 13 13:29:19.901189 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Dec 13 13:29:19.901264 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Dec 13 13:29:19.901334 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Dec 13 13:29:19.901424 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Dec 13 13:29:19.901523 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Dec 13 13:29:19.901608 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Dec 13 13:29:19.901699 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Dec 13 13:29:19.901775 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Dec 13 13:29:19.901844 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Dec 13 13:29:19.901919 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Dec 13 13:29:19.901998 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Dec 13 13:29:19.902086 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 13 13:29:19.902160 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Dec 13 13:29:19.902236 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 13 13:29:19.902305 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Dec 13 13:29:19.902383 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Dec 13 13:29:19.902455 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 13 13:29:19.904687 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Dec 13 13:29:19.904811 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 13 13:29:19.904893 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Dec 13 13:29:19.904959 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Dec 13 13:29:19.905025 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 13 13:29:19.905121 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 13 13:29:19.905195 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Dec 13 13:29:19.905267 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 13 13:29:19.905336 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Dec 13 13:29:19.905406 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Dec 13 13:29:19.905474 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 13 13:29:19.905624 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Dec 13 13:29:19.905734 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 13 13:29:19.905836 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Dec 13 13:29:19.905911 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Dec 13 13:29:19.905980 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 13 13:29:19.906076 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Dec 13 13:29:19.906159 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 13 13:29:19.906329 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Dec 13 13:29:19.906404 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Dec 13 13:29:19.906707 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 13 13:29:19.906802 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Dec 13 13:29:19.906876 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Dec 13 13:29:19.906948 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 13 13:29:19.907019 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Dec 13 13:29:19.907192 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Dec 13 13:29:19.907333 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 13 13:29:19.907425 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Dec 13 13:29:19.907545 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Dec 13 13:29:19.907625 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Dec 13 13:29:19.907698 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 13 13:29:19.907769 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Dec 13 13:29:19.907918 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Dec 13 13:29:19.907999 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 13 13:29:19.908134 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 13 13:29:19.908232 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Dec 13 13:29:19.908303 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Dec 13 13:29:19.908371 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 13 13:29:19.908444 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 13 13:29:19.908548 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Dec 13 13:29:19.908626 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Dec 13 13:29:19.908695 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Dec 13 13:29:19.908768 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Dec 13 13:29:19.908831 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Dec 13 13:29:19.908892 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Dec 13 13:29:19.908969 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Dec 13 13:29:19.909049 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Dec 13 13:29:19.909121 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 13 13:29:19.909198 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Dec 13 13:29:19.909262 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Dec 13 13:29:19.909325 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 13 13:29:19.909398 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Dec 13 13:29:19.909463 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Dec 13 13:29:19.909576 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 13 13:29:19.909656 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Dec 13 13:29:19.909721 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Dec 13 13:29:19.909799 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 13 13:29:19.909871 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Dec 13 13:29:19.909935 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Dec 13 13:29:19.909998 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 13 13:29:19.910121 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Dec 13 13:29:19.910196 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Dec 13 13:29:19.910260 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 13 13:29:19.910332 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Dec 13 13:29:19.910399 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Dec 13 13:29:19.910466 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 13 13:29:19.910641 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Dec 13 13:29:19.910710 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Dec 13 13:29:19.910773 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 13 13:29:19.910842 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Dec 13 13:29:19.910904 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Dec 13 13:29:19.910970 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Dec 13 13:29:19.910980 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Dec 13 13:29:19.910988 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Dec 13 13:29:19.910996 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Dec 13 13:29:19.911003 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Dec 13 13:29:19.911011 kernel: iommu: Default domain type: Translated
Dec 13 13:29:19.911018 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Dec 13 13:29:19.911037 kernel: efivars: Registered efivars operations
Dec 13 13:29:19.911047 kernel: vgaarb: loaded
Dec 13 13:29:19.911058 kernel: clocksource: Switched to clocksource arch_sys_counter
Dec 13 13:29:19.911066 kernel: VFS: Disk quotas dquot_6.6.0
Dec 13 13:29:19.911074 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 13 13:29:19.911081 kernel: pnp: PnP ACPI init
Dec 13 13:29:19.911163 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Dec 13 13:29:19.911174 kernel: pnp: PnP ACPI: found 1 devices
Dec 13 13:29:19.911182 kernel: NET: Registered PF_INET protocol family
Dec 13 13:29:19.911189 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 13 13:29:19.911197 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Dec 13 13:29:19.911207 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 13 13:29:19.911215 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 13 13:29:19.911222 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Dec 13 13:29:19.911230 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Dec 13 13:29:19.911237 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Dec 13 13:29:19.911245 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Dec 13 13:29:19.911253 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 13 13:29:19.911330 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Dec 13 13:29:19.911344 kernel: PCI: CLS 0 bytes, default 64
Dec 13 13:29:19.911351 kernel: kvm [1]: HYP mode not available
Dec 13 13:29:19.911359 kernel: Initialise system trusted keyrings
Dec 13 13:29:19.911366 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Dec 13 13:29:19.911373 kernel: Key type asymmetric registered
Dec 13 13:29:19.911381 kernel: Asymmetric key parser 'x509' registered
Dec 13 13:29:19.911388 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Dec 13 13:29:19.911396 kernel: io scheduler mq-deadline registered
Dec 13 13:29:19.911403 kernel: io scheduler kyber registered
Dec 13 13:29:19.911412 kernel: io scheduler bfq registered
Dec 13 13:29:19.911421 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Dec 13 13:29:19.911501 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Dec 13 13:29:19.911593 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Dec 13 13:29:19.911663 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 13 13:29:19.911734 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Dec 13 13:29:19.911803 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Dec 13 13:29:19.911876 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 13 13:29:19.911948 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Dec 13 13:29:19.912015 kernel: pcieport 0000:00:02.2:
AER: enabled with IRQ 52 Dec 13 13:29:19.912095 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.912167 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Dec 13 13:29:19.912235 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Dec 13 13:29:19.912308 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.912380 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Dec 13 13:29:19.912447 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Dec 13 13:29:19.914636 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.914750 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Dec 13 13:29:19.914821 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Dec 13 13:29:19.914902 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.914974 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Dec 13 13:29:19.915090 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Dec 13 13:29:19.915166 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.915241 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Dec 13 13:29:19.915312 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Dec 13 13:29:19.915387 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.915398 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 13 13:29:19.915469 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 
Dec 13 13:29:19.916463 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Dec 13 13:29:19.916608 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 13 13:29:19.916621 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 13 13:29:19.916629 kernel: ACPI: button: Power Button [PWRB] Dec 13 13:29:19.916644 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 13 13:29:19.916721 kernel: virtio-pci 0000:03:00.0: enabling device (0000 -> 0002) Dec 13 13:29:19.916797 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 13 13:29:19.916872 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Dec 13 13:29:19.916883 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 13 13:29:19.916891 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 13 13:29:19.916960 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Dec 13 13:29:19.916971 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Dec 13 13:29:19.916981 kernel: thunder_xcv, ver 1.0 Dec 13 13:29:19.916989 kernel: thunder_bgx, ver 1.0 Dec 13 13:29:19.916996 kernel: nicpf, ver 1.0 Dec 13 13:29:19.917003 kernel: nicvf, ver 1.0 Dec 13 13:29:19.917103 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 13 13:29:19.917170 kernel: rtc-efi rtc-efi.0: setting system clock to 2024-12-13T13:29:19 UTC (1734096559) Dec 13 13:29:19.917180 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 13 13:29:19.917188 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Dec 13 13:29:19.917198 kernel: watchdog: Delayed init of the lockup detector failed: -19 Dec 13 13:29:19.917206 kernel: watchdog: Hard watchdog permanently disabled Dec 13 13:29:19.917213 kernel: NET: Registered PF_INET6 protocol family Dec 13 13:29:19.917221 kernel: Segment Routing with IPv6 Dec 13 13:29:19.917229 kernel: In-situ 
OAM (IOAM) with IPv6 Dec 13 13:29:19.917237 kernel: NET: Registered PF_PACKET protocol family Dec 13 13:29:19.917245 kernel: Key type dns_resolver registered Dec 13 13:29:19.917255 kernel: registered taskstats version 1 Dec 13 13:29:19.917264 kernel: Loading compiled-in X.509 certificates Dec 13 13:29:19.917274 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.65-flatcar: 752b3e36c6039904ea643ccad2b3f5f3cb4ebf78' Dec 13 13:29:19.917285 kernel: Key type .fscrypt registered Dec 13 13:29:19.917292 kernel: Key type fscrypt-provisioning registered Dec 13 13:29:19.917300 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 13 13:29:19.917308 kernel: ima: Allocated hash algorithm: sha1 Dec 13 13:29:19.917316 kernel: ima: No architecture policies found Dec 13 13:29:19.917324 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 13 13:29:19.917331 kernel: clk: Disabling unused clocks Dec 13 13:29:19.917339 kernel: Freeing unused kernel memory: 39936K Dec 13 13:29:19.917348 kernel: Run /init as init process Dec 13 13:29:19.917355 kernel: with arguments: Dec 13 13:29:19.917363 kernel: /init Dec 13 13:29:19.917370 kernel: with environment: Dec 13 13:29:19.917377 kernel: HOME=/ Dec 13 13:29:19.917384 kernel: TERM=linux Dec 13 13:29:19.917391 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Dec 13 13:29:19.917401 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Dec 13 13:29:19.917413 systemd[1]: Detected virtualization kvm. Dec 13 13:29:19.917421 systemd[1]: Detected architecture arm64. Dec 13 13:29:19.917429 systemd[1]: Running in initrd. Dec 13 13:29:19.917437 systemd[1]: No hostname configured, using default hostname. 
Dec 13 13:29:19.917445 systemd[1]: Hostname set to <localhost>. Dec 13 13:29:19.917453 systemd[1]: Initializing machine ID from VM UUID. Dec 13 13:29:19.917460 systemd[1]: Queued start job for default target initrd.target. Dec 13 13:29:19.917468 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 13 13:29:19.917478 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 13 13:29:19.917486 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 13 13:29:19.919553 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 13 13:29:19.919574 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 13 13:29:19.919584 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 13 13:29:19.919594 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 13 13:29:19.919602 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 13 13:29:19.919617 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 13 13:29:19.919625 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 13 13:29:19.919633 systemd[1]: Reached target paths.target - Path Units. Dec 13 13:29:19.919642 systemd[1]: Reached target slices.target - Slice Units. Dec 13 13:29:19.919650 systemd[1]: Reached target swap.target - Swaps. Dec 13 13:29:19.919658 systemd[1]: Reached target timers.target - Timer Units. Dec 13 13:29:19.919668 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 13 13:29:19.919679 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Dec 13 13:29:19.919690 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 13 13:29:19.919699 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Dec 13 13:29:19.919710 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 13 13:29:19.919718 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 13 13:29:19.919726 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 13 13:29:19.919734 systemd[1]: Reached target sockets.target - Socket Units. Dec 13 13:29:19.919742 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 13 13:29:19.919750 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 13 13:29:19.919758 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 13 13:29:19.919767 systemd[1]: Starting systemd-fsck-usr.service... Dec 13 13:29:19.919775 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 13 13:29:19.919783 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 13 13:29:19.919791 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 13 13:29:19.919799 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 13 13:29:19.919806 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 13 13:29:19.919841 systemd-journald[237]: Collecting audit messages is disabled. Dec 13 13:29:19.919866 systemd[1]: Finished systemd-fsck-usr.service. Dec 13 13:29:19.919875 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 13 13:29:19.919885 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 13 13:29:19.919893 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Dec 13 13:29:19.919903 systemd-journald[237]: Journal started Dec 13 13:29:19.919925 systemd-journald[237]: Runtime Journal (/run/log/journal/9b8aa4c4d49d483d8d3d4e9b09cd7730) is 8.0M, max 76.5M, 68.5M free. Dec 13 13:29:19.897091 systemd-modules-load[238]: Inserted module 'overlay' Dec 13 13:29:19.922180 systemd-modules-load[238]: Inserted module 'br_netfilter' Dec 13 13:29:19.923024 systemd[1]: Started systemd-journald.service - Journal Service. Dec 13 13:29:19.923094 kernel: Bridge firewalling registered Dec 13 13:29:19.923340 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 13 13:29:19.924234 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 13 13:29:19.932811 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 13 13:29:19.936717 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 13 13:29:19.937946 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 13 13:29:19.942809 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 13 13:29:19.958203 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 13 13:29:19.962612 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 13 13:29:19.969904 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 13 13:29:19.976806 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 13 13:29:19.978804 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 13 13:29:19.988740 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 13 13:29:20.018669 systemd-resolved[272]: Positive Trust Anchors: Dec 13 13:29:20.018690 systemd-resolved[272]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 13 13:29:20.018722 systemd-resolved[272]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 13 13:29:20.024701 systemd-resolved[272]: Defaulting to hostname 'linux'. Dec 13 13:29:20.026069 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 13 13:29:20.027387 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 13 13:29:20.029953 dracut-cmdline[275]: dracut-dracut-053 Dec 13 13:29:20.033628 dracut-cmdline[275]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=c48af8adabdaf1d8e07ceb011d2665929c607ddf2c4d40203b31334d745cc472 Dec 13 13:29:20.123544 kernel: SCSI subsystem initialized Dec 13 13:29:20.128606 kernel: Loading iSCSI transport class v2.0-870. Dec 13 13:29:20.137588 kernel: iscsi: registered transport (tcp) Dec 13 13:29:20.152551 kernel: iscsi: registered transport (qla4xxx) Dec 13 13:29:20.152709 kernel: QLogic iSCSI HBA Driver Dec 13 13:29:20.209698 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 13 13:29:20.215756 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Dec 13 13:29:20.248955 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 13 13:29:20.249037 kernel: device-mapper: uevent: version 1.0.3 Dec 13 13:29:20.249051 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Dec 13 13:29:20.303584 kernel: raid6: neonx8 gen() 15452 MB/s Dec 13 13:29:20.320549 kernel: raid6: neonx4 gen() 15339 MB/s Dec 13 13:29:20.337560 kernel: raid6: neonx2 gen() 12944 MB/s Dec 13 13:29:20.354555 kernel: raid6: neonx1 gen() 10148 MB/s Dec 13 13:29:20.371547 kernel: raid6: int64x8 gen() 6711 MB/s Dec 13 13:29:20.388557 kernel: raid6: int64x4 gen() 7248 MB/s Dec 13 13:29:20.406302 kernel: raid6: int64x2 gen() 6002 MB/s Dec 13 13:29:20.422618 kernel: raid6: int64x1 gen() 4961 MB/s Dec 13 13:29:20.422707 kernel: raid6: using algorithm neonx8 gen() 15452 MB/s Dec 13 13:29:20.439557 kernel: raid6: .... xor() 11660 MB/s, rmw enabled Dec 13 13:29:20.439633 kernel: raid6: using neon recovery algorithm Dec 13 13:29:20.444661 kernel: xor: measuring software checksum speed Dec 13 13:29:20.444803 kernel: 8regs : 21601 MB/sec Dec 13 13:29:20.444820 kernel: 32regs : 21693 MB/sec Dec 13 13:29:20.445846 kernel: arm64_neon : 27974 MB/sec Dec 13 13:29:20.445903 kernel: xor: using function: arm64_neon (27974 MB/sec) Dec 13 13:29:20.498555 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 13 13:29:20.516688 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 13 13:29:20.523755 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 13 13:29:20.537257 systemd-udevd[456]: Using default interface naming scheme 'v255'. Dec 13 13:29:20.540949 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 13 13:29:20.549719 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Dec 13 13:29:20.570084 dracut-pre-trigger[464]: rd.md=0: removing MD RAID activation Dec 13 13:29:20.610714 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 13 13:29:20.617769 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 13 13:29:20.675719 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 13 13:29:20.680995 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 13 13:29:20.712838 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 13 13:29:20.715513 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 13 13:29:20.716206 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 13 13:29:20.720040 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 13 13:29:20.728995 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 13 13:29:20.757809 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 13 13:29:20.829399 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 13 13:29:20.830492 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 13 13:29:20.835072 kernel: ACPI: bus type USB registered Dec 13 13:29:20.835097 kernel: usbcore: registered new interface driver usbfs Dec 13 13:29:20.835107 kernel: usbcore: registered new interface driver hub Dec 13 13:29:20.831524 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 13 13:29:20.833140 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 13 13:29:20.833427 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 13 13:29:20.837253 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Dec 13 13:29:20.843849 kernel: scsi host0: Virtio SCSI HBA Dec 13 13:29:20.846575 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Dec 13 13:29:20.847259 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Dec 13 13:29:20.847590 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 13 13:29:20.851744 kernel: usbcore: registered new device driver usb Dec 13 13:29:20.866905 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 13 13:29:20.878243 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 13 13:29:20.897597 kernel: sr 0:0:0:0: Power-on or device reset occurred Dec 13 13:29:20.904572 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Dec 13 13:29:20.904793 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 13 13:29:20.904810 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Dec 13 13:29:20.900755 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Dec 13 13:29:20.911527 kernel: sd 0:0:0:1: Power-on or device reset occurred Dec 13 13:29:20.924948 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Dec 13 13:29:20.925166 kernel: sd 0:0:0:1: [sda] Write Protect is off Dec 13 13:29:20.925266 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Dec 13 13:29:20.925360 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Dec 13 13:29:20.925461 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 13 13:29:20.925615 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 13 13:29:20.925709 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 13 13:29:20.925799 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 13 13:29:20.925895 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 13 13:29:20.926029 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 13 13:29:20.926135 kernel: hub 1-0:1.0: USB hub found Dec 13 13:29:20.926314 kernel: hub 1-0:1.0: 4 ports detected Dec 13 13:29:20.926411 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 13 13:29:20.926797 kernel: hub 2-0:1.0: USB hub found Dec 13 13:29:20.926915 kernel: hub 2-0:1.0: 4 ports detected Dec 13 13:29:20.927002 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 13 13:29:20.927030 kernel: GPT:17805311 != 80003071 Dec 13 13:29:20.927043 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 13 13:29:20.927059 kernel: GPT:17805311 != 80003071 Dec 13 13:29:20.927068 kernel: GPT: Use GNU Parted to correct GPT errors. 
Dec 13 13:29:20.927078 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 13 13:29:20.927087 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Dec 13 13:29:20.984605 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (528) Dec 13 13:29:20.991530 kernel: BTRFS: device fsid 47b12626-f7d3-4179-9720-ca262eb4c614 devid 1 transid 38 /dev/sda3 scanned by (udev-worker) (504) Dec 13 13:29:20.998136 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Dec 13 13:29:21.007227 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 13 13:29:21.015974 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Dec 13 13:29:21.021692 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Dec 13 13:29:21.024683 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Dec 13 13:29:21.031810 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 13 13:29:21.043565 disk-uuid[574]: Primary Header is updated. Dec 13 13:29:21.043565 disk-uuid[574]: Secondary Entries is updated. Dec 13 13:29:21.043565 disk-uuid[574]: Secondary Header is updated. 
Dec 13 13:29:21.059916 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 13 13:29:21.162514 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 13 13:29:21.403567 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 13 13:29:21.542098 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 13 13:29:21.542174 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 13 13:29:21.542543 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 13 13:29:21.595517 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 13 13:29:21.595773 kernel: usbcore: registered new interface driver usbhid Dec 13 13:29:21.596627 kernel: usbhid: USB HID core driver Dec 13 13:29:22.082680 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 13 13:29:22.084206 disk-uuid[575]: The operation has completed successfully. Dec 13 13:29:22.146685 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 13 13:29:22.146802 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 13 13:29:22.163879 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 13 13:29:22.169738 sh[589]: Success Dec 13 13:29:22.187586 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Dec 13 13:29:22.250864 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 13 13:29:22.260684 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 13 13:29:22.263586 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Dec 13 13:29:22.291562 kernel: BTRFS info (device dm-0): first mount of filesystem 47b12626-f7d3-4179-9720-ca262eb4c614 Dec 13 13:29:22.291640 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 13 13:29:22.292883 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Dec 13 13:29:22.292923 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 13 13:29:22.292939 kernel: BTRFS info (device dm-0): using free space tree Dec 13 13:29:22.300556 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 13 13:29:22.302696 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 13 13:29:22.304753 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 13 13:29:22.310734 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 13 13:29:22.316229 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 13 13:29:22.329263 kernel: BTRFS info (device sda6): first mount of filesystem d0a3d620-8ab2-45d8-a26c-bb488ffd59f2 Dec 13 13:29:22.329369 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 13 13:29:22.330183 kernel: BTRFS info (device sda6): using free space tree Dec 13 13:29:22.336536 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 13 13:29:22.336612 kernel: BTRFS info (device sda6): auto enabling async discard Dec 13 13:29:22.350443 systemd[1]: mnt-oem.mount: Deactivated successfully. Dec 13 13:29:22.351198 kernel: BTRFS info (device sda6): last unmount of filesystem d0a3d620-8ab2-45d8-a26c-bb488ffd59f2 Dec 13 13:29:22.362091 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 13 13:29:22.369856 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Dec 13 13:29:22.441072 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 13 13:29:22.449978 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 13 13:29:22.477859 systemd-networkd[773]: lo: Link UP Dec 13 13:29:22.477875 systemd-networkd[773]: lo: Gained carrier Dec 13 13:29:22.480305 systemd-networkd[773]: Enumeration completed Dec 13 13:29:22.480441 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 13 13:29:22.481379 systemd[1]: Reached target network.target - Network. Dec 13 13:29:22.482693 systemd-networkd[773]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 13 13:29:22.482700 systemd-networkd[773]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 13 13:29:22.483774 systemd-networkd[773]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 13 13:29:22.483777 systemd-networkd[773]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 13 13:29:22.484602 systemd-networkd[773]: eth0: Link UP Dec 13 13:29:22.484606 systemd-networkd[773]: eth0: Gained carrier Dec 13 13:29:22.484614 systemd-networkd[773]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 13 13:29:22.491855 systemd-networkd[773]: eth1: Link UP Dec 13 13:29:22.491909 systemd-networkd[773]: eth1: Gained carrier Dec 13 13:29:22.491922 systemd-networkd[773]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Dec 13 13:29:22.508386 ignition[685]: Ignition 2.20.0
Dec 13 13:29:22.508398 ignition[685]: Stage: fetch-offline
Dec 13 13:29:22.508444 ignition[685]: no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:22.508453 ignition[685]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:22.508786 ignition[685]: parsed url from cmdline: ""
Dec 13 13:29:22.511787 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 13 13:29:22.508790 ignition[685]: no config URL provided
Dec 13 13:29:22.508796 ignition[685]: reading system config file "/usr/lib/ignition/user.ign"
Dec 13 13:29:22.508805 ignition[685]: no config at "/usr/lib/ignition/user.ign"
Dec 13 13:29:22.508811 ignition[685]: failed to fetch config: resource requires networking
Dec 13 13:29:22.509064 ignition[685]: Ignition finished successfully
Dec 13 13:29:22.519886 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 13 13:29:22.523701 systemd-networkd[773]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Dec 13 13:29:22.533902 ignition[783]: Ignition 2.20.0
Dec 13 13:29:22.533912 ignition[783]: Stage: fetch
Dec 13 13:29:22.534102 ignition[783]: no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:22.534115 ignition[783]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:22.534211 ignition[783]: parsed url from cmdline: ""
Dec 13 13:29:22.534214 ignition[783]: no config URL provided
Dec 13 13:29:22.534219 ignition[783]: reading system config file "/usr/lib/ignition/user.ign"
Dec 13 13:29:22.534227 ignition[783]: no config at "/usr/lib/ignition/user.ign"
Dec 13 13:29:22.534311 ignition[783]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Dec 13 13:29:22.535134 ignition[783]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Dec 13 13:29:22.556609 systemd-networkd[773]: eth0: DHCPv4 address 188.245.225.138/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 13 13:29:22.735315 ignition[783]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Dec 13 13:29:22.741654 ignition[783]: GET result: OK
Dec 13 13:29:22.741789 ignition[783]: parsing config with SHA512: 0313729bf6a29a34ecd8136344aeb71c5b2ef25496d0f55841e0213e553ef3766ed9e072d256fc7775e088fe75503644018bca5606dc477e082bc8c521ead6d1
Dec 13 13:29:22.747780 unknown[783]: fetched base config from "system"
Dec 13 13:29:22.747790 unknown[783]: fetched base config from "system"
Dec 13 13:29:22.748190 ignition[783]: fetch: fetch complete
Dec 13 13:29:22.747795 unknown[783]: fetched user config from "hetzner"
Dec 13 13:29:22.748196 ignition[783]: fetch: fetch passed
Dec 13 13:29:22.750482 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 13 13:29:22.748244 ignition[783]: Ignition finished successfully
Dec 13 13:29:22.761878 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 13 13:29:22.781232 ignition[790]: Ignition 2.20.0
Dec 13 13:29:22.781242 ignition[790]: Stage: kargs
Dec 13 13:29:22.781442 ignition[790]: no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:22.781452 ignition[790]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:22.784637 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 13 13:29:22.782513 ignition[790]: kargs: kargs passed
Dec 13 13:29:22.782577 ignition[790]: Ignition finished successfully
Dec 13 13:29:22.790032 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 13 13:29:22.806822 ignition[797]: Ignition 2.20.0
Dec 13 13:29:22.807464 ignition[797]: Stage: disks
Dec 13 13:29:22.807718 ignition[797]: no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:22.807731 ignition[797]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:22.810350 ignition[797]: disks: disks passed
Dec 13 13:29:22.810881 ignition[797]: Ignition finished successfully
Dec 13 13:29:22.813617 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 13 13:29:22.814836 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 13 13:29:22.815904 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 13 13:29:22.817083 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 13 13:29:22.818372 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 13 13:29:22.819618 systemd[1]: Reached target basic.target - Basic System.
Dec 13 13:29:22.831852 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 13 13:29:22.852812 systemd-fsck[806]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Dec 13 13:29:22.857266 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 13 13:29:22.865787 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 13 13:29:22.918689 kernel: EXT4-fs (sda9): mounted filesystem 0aa4851d-a2ba-4d04-90b3-5d00bf608ecc r/w with ordered data mode. Quota mode: none.
Dec 13 13:29:22.920279 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 13 13:29:22.921643 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 13 13:29:22.928927 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 13 13:29:22.931715 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 13 13:29:22.937829 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Dec 13 13:29:22.941604 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 13 13:29:22.943650 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 13 13:29:22.951536 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (815)
Dec 13 13:29:22.953692 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 13 13:29:22.958384 kernel: BTRFS info (device sda6): first mount of filesystem d0a3d620-8ab2-45d8-a26c-bb488ffd59f2
Dec 13 13:29:22.958409 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 13 13:29:22.958429 kernel: BTRFS info (device sda6): using free space tree
Dec 13 13:29:22.963382 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 13 13:29:22.963445 kernel: BTRFS info (device sda6): auto enabling async discard
Dec 13 13:29:22.964739 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 13 13:29:22.966594 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 13 13:29:23.014598 coreos-metadata[817]: Dec 13 13:29:23.014 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Dec 13 13:29:23.016881 coreos-metadata[817]: Dec 13 13:29:23.016 INFO Fetch successful
Dec 13 13:29:23.019088 coreos-metadata[817]: Dec 13 13:29:23.017 INFO wrote hostname ci-4186-0-0-4-8ed7fad560 to /sysroot/etc/hostname
Dec 13 13:29:23.020081 initrd-setup-root[842]: cut: /sysroot/etc/passwd: No such file or directory
Dec 13 13:29:23.022279 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 13 13:29:23.028547 initrd-setup-root[850]: cut: /sysroot/etc/group: No such file or directory
Dec 13 13:29:23.034941 initrd-setup-root[857]: cut: /sysroot/etc/shadow: No such file or directory
Dec 13 13:29:23.041324 initrd-setup-root[864]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 13 13:29:23.156966 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 13 13:29:23.161690 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 13 13:29:23.163706 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 13 13:29:23.174539 kernel: BTRFS info (device sda6): last unmount of filesystem d0a3d620-8ab2-45d8-a26c-bb488ffd59f2
Dec 13 13:29:23.199223 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 13 13:29:23.203829 ignition[931]: INFO : Ignition 2.20.0
Dec 13 13:29:23.203829 ignition[931]: INFO : Stage: mount
Dec 13 13:29:23.205015 ignition[931]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:23.205015 ignition[931]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:23.207197 ignition[931]: INFO : mount: mount passed
Dec 13 13:29:23.207197 ignition[931]: INFO : Ignition finished successfully
Dec 13 13:29:23.207362 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 13 13:29:23.214669 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 13 13:29:23.290278 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 13 13:29:23.299424 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 13 13:29:23.312606 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (943)
Dec 13 13:29:23.315652 kernel: BTRFS info (device sda6): first mount of filesystem d0a3d620-8ab2-45d8-a26c-bb488ffd59f2
Dec 13 13:29:23.315728 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 13 13:29:23.315740 kernel: BTRFS info (device sda6): using free space tree
Dec 13 13:29:23.321677 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 13 13:29:23.321766 kernel: BTRFS info (device sda6): auto enabling async discard
Dec 13 13:29:23.325323 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 13 13:29:23.349556 ignition[960]: INFO : Ignition 2.20.0
Dec 13 13:29:23.349556 ignition[960]: INFO : Stage: files
Dec 13 13:29:23.351003 ignition[960]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:23.351003 ignition[960]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:23.351003 ignition[960]: DEBUG : files: compiled without relabeling support, skipping
Dec 13 13:29:23.354577 ignition[960]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 13 13:29:23.354577 ignition[960]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 13 13:29:23.357759 ignition[960]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 13 13:29:23.357759 ignition[960]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 13 13:29:23.361635 ignition[960]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 13 13:29:23.358968 unknown[960]: wrote ssh authorized keys file for user: core
Dec 13 13:29:23.364679 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Dec 13 13:29:23.364679 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Dec 13 13:29:23.476352 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 13 13:29:23.766152 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Dec 13 13:29:23.766152 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 13 13:29:23.768647 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 13 13:29:23.768647 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 13 13:29:23.768647 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 13 13:29:23.768647 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 13 13:29:23.768647 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 13 13:29:23.768647 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Dec 13 13:29:23.775552 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-arm64.raw: attempt #1
Dec 13 13:29:24.150787 systemd-networkd[773]: eth1: Gained IPv6LL
Dec 13 13:29:24.215111 systemd-networkd[773]: eth0: Gained IPv6LL
Dec 13 13:29:24.364555 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 13 13:29:24.697036 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-arm64.raw"
Dec 13 13:29:24.697036 ignition[960]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 13 13:29:24.700646 ignition[960]: INFO : files: files passed
Dec 13 13:29:24.700646 ignition[960]: INFO : Ignition finished successfully
Dec 13 13:29:24.700584 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 13 13:29:24.706751 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 13 13:29:24.711685 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 13 13:29:24.717811 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 13 13:29:24.717931 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 13 13:29:24.734807 initrd-setup-root-after-ignition[988]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 13 13:29:24.734807 initrd-setup-root-after-ignition[988]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 13 13:29:24.737857 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 13 13:29:24.738905 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 13 13:29:24.740733 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 13 13:29:24.746717 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 13 13:29:24.778654 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 13 13:29:24.779606 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 13 13:29:24.780853 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 13 13:29:24.781689 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 13 13:29:24.782394 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 13 13:29:24.791309 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 13 13:29:24.805141 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 13 13:29:24.814718 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 13 13:29:24.826568 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 13 13:29:24.827936 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 13 13:29:24.828741 systemd[1]: Stopped target timers.target - Timer Units.
Dec 13 13:29:24.829767 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 13 13:29:24.829947 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 13 13:29:24.831226 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 13 13:29:24.832370 systemd[1]: Stopped target basic.target - Basic System.
Dec 13 13:29:24.833276 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 13 13:29:24.834320 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 13 13:29:24.835333 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 13 13:29:24.836349 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 13 13:29:24.837362 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 13 13:29:24.838517 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 13 13:29:24.839407 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 13 13:29:24.840299 systemd[1]: Stopped target swap.target - Swaps.
Dec 13 13:29:24.841076 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 13 13:29:24.841233 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 13 13:29:24.842362 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 13 13:29:24.843395 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 13 13:29:24.844346 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 13 13:29:24.844449 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 13 13:29:24.845430 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 13 13:29:24.845607 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 13 13:29:24.847071 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 13 13:29:24.847228 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 13 13:29:24.848208 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 13 13:29:24.848354 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 13 13:29:24.849154 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Dec 13 13:29:24.849284 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 13 13:29:24.861406 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 13 13:29:24.866192 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 13 13:29:24.867464 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 13 13:29:24.867866 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 13 13:29:24.873742 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 13 13:29:24.873907 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 13 13:29:24.882581 ignition[1012]: INFO : Ignition 2.20.0
Dec 13 13:29:24.882581 ignition[1012]: INFO : Stage: umount
Dec 13 13:29:24.882581 ignition[1012]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 13 13:29:24.882581 ignition[1012]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 13 13:29:24.886431 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 13 13:29:24.888805 ignition[1012]: INFO : umount: umount passed
Dec 13 13:29:24.888805 ignition[1012]: INFO : Ignition finished successfully
Dec 13 13:29:24.886611 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 13 13:29:24.891731 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 13 13:29:24.891866 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 13 13:29:24.893217 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 13 13:29:24.893265 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 13 13:29:24.893937 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 13 13:29:24.893992 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 13 13:29:24.894929 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 13 13:29:24.894983 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 13 13:29:24.898641 systemd[1]: Stopped target network.target - Network.
Dec 13 13:29:24.900110 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 13 13:29:24.900189 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 13 13:29:24.903234 systemd[1]: Stopped target paths.target - Path Units.
Dec 13 13:29:24.903872 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 13 13:29:24.905723 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 13 13:29:24.908204 systemd[1]: Stopped target slices.target - Slice Units.
Dec 13 13:29:24.909900 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 13 13:29:24.910866 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 13 13:29:24.910921 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 13 13:29:24.913607 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 13 13:29:24.913680 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 13 13:29:24.914288 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 13 13:29:24.914339 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 13 13:29:24.914947 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 13 13:29:24.915019 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 13 13:29:24.918866 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 13 13:29:24.921126 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 13 13:29:24.921554 systemd-networkd[773]: eth0: DHCPv6 lease lost
Dec 13 13:29:24.924366 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 13 13:29:24.924572 systemd-networkd[773]: eth1: DHCPv6 lease lost
Dec 13 13:29:24.929200 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 13 13:29:24.929312 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 13 13:29:24.932742 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 13 13:29:24.932839 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 13 13:29:24.939262 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 13 13:29:24.939323 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 13 13:29:24.948665 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 13 13:29:24.949324 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 13 13:29:24.949395 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 13 13:29:24.953165 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 13 13:29:24.953265 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 13 13:29:24.955121 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 13 13:29:24.955181 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 13 13:29:24.956088 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 13 13:29:24.956135 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 13 13:29:24.956883 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 13 13:29:24.959152 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 13 13:29:24.959337 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 13 13:29:24.969887 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 13 13:29:24.970009 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 13 13:29:24.985558 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 13 13:29:24.987156 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 13 13:29:24.990127 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 13 13:29:24.990281 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 13 13:29:24.991588 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 13 13:29:24.991626 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 13 13:29:24.992540 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 13 13:29:24.992591 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 13 13:29:24.994153 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 13 13:29:24.994194 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 13 13:29:24.995827 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 13 13:29:24.995910 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 13 13:29:25.005781 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 13 13:29:25.007091 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 13 13:29:25.007179 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 13 13:29:25.008324 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 13 13:29:25.008382 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 13 13:29:25.009284 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 13 13:29:25.009330 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 13 13:29:25.010079 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 13 13:29:25.010117 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 13 13:29:25.013791 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 13 13:29:25.013901 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 13 13:29:25.018342 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 13 13:29:25.018456 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 13 13:29:25.019799 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 13 13:29:25.029236 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 13 13:29:25.038078 systemd[1]: Switching root.
Dec 13 13:29:25.074671 systemd-journald[237]: Journal stopped
Dec 13 13:29:25.969744 systemd-journald[237]: Received SIGTERM from PID 1 (systemd).
Dec 13 13:29:25.969830 kernel: SELinux: policy capability network_peer_controls=1
Dec 13 13:29:25.969843 kernel: SELinux: policy capability open_perms=1
Dec 13 13:29:25.969852 kernel: SELinux: policy capability extended_socket_class=1
Dec 13 13:29:25.969861 kernel: SELinux: policy capability always_check_network=0
Dec 13 13:29:25.969875 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 13 13:29:25.969889 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 13 13:29:25.969898 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 13 13:29:25.969906 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 13 13:29:25.969915 kernel: audit: type=1403 audit(1734096565.210:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 13 13:29:25.969925 systemd[1]: Successfully loaded SELinux policy in 34.591ms.
Dec 13 13:29:25.969962 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 11.158ms.
Dec 13 13:29:25.969976 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Dec 13 13:29:25.969986 systemd[1]: Detected virtualization kvm.
Dec 13 13:29:25.970003 systemd[1]: Detected architecture arm64.
Dec 13 13:29:25.970014 systemd[1]: Detected first boot.
Dec 13 13:29:25.970023 systemd[1]: Hostname set to .
Dec 13 13:29:25.970033 systemd[1]: Initializing machine ID from VM UUID.
Dec 13 13:29:25.970043 zram_generator::config[1054]: No configuration found.
Dec 13 13:29:25.970054 systemd[1]: Populated /etc with preset unit settings.
Dec 13 13:29:25.970064 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 13 13:29:25.970076 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 13 13:29:25.970087 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 13 13:29:25.970097 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 13 13:29:25.970107 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 13 13:29:25.970117 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 13 13:29:25.970127 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 13 13:29:25.970137 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 13 13:29:25.970150 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 13 13:29:25.970160 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 13 13:29:25.970172 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 13 13:29:25.970182 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 13 13:29:25.970192 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 13 13:29:25.970202 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 13 13:29:25.970212 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 13 13:29:25.970222 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 13 13:29:25.970232 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 13 13:29:25.970243 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 13 13:29:25.970253 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 13 13:29:25.970265 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 13 13:29:25.970275 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 13 13:29:25.970285 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 13 13:29:25.970295 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 13 13:29:25.970304 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 13 13:29:25.970315 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 13 13:29:25.970327 systemd[1]: Reached target slices.target - Slice Units.
Dec 13 13:29:25.970337 systemd[1]: Reached target swap.target - Swaps.
Dec 13 13:29:25.970348 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 13 13:29:25.970360 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 13 13:29:25.970370 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 13 13:29:25.970380 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 13 13:29:25.970390 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 13 13:29:25.970400 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 13 13:29:25.970411 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 13 13:29:25.970422 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 13 13:29:25.970432 systemd[1]: Mounting media.mount - External Media Directory...
Dec 13 13:29:25.970442 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 13 13:29:25.970451 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 13 13:29:25.970461 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 13 13:29:25.970472 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 13 13:29:25.970482 systemd[1]: Reached target machines.target - Containers.
Dec 13 13:29:25.970492 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 13 13:29:25.970674 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 13 13:29:25.970704 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 13 13:29:25.970718 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 13 13:29:25.970728 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 13 13:29:25.970744 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 13 13:29:25.970754 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 13 13:29:25.970767 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 13 13:29:25.970779 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 13 13:29:25.970789 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 13 13:29:25.970800 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 13 13:29:25.970810 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 13 13:29:25.970824 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 13 13:29:25.970834 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 13 13:29:25.970844 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 13 13:29:25.970854 kernel: fuse: init (API version 7.39)
Dec 13 13:29:25.970866 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 13 13:29:25.970876 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 13 13:29:25.970887 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 13 13:29:25.970896 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 13 13:29:25.970906 systemd[1]: verity-setup.service: Deactivated successfully.
Dec 13 13:29:25.970916 systemd[1]: Stopped verity-setup.service.
Dec 13 13:29:25.970926 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 13 13:29:25.970937 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 13 13:29:25.970993 systemd[1]: Mounted media.mount - External Media Directory.
Dec 13 13:29:25.971009 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 13 13:29:25.971019 kernel: loop: module loaded
Dec 13 13:29:25.971029 kernel: ACPI: bus type drm_connector registered
Dec 13 13:29:25.971038 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 13 13:29:25.972183 systemd-journald[1117]: Collecting audit messages is disabled.
Dec 13 13:29:25.972220 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 13 13:29:25.972231 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 13 13:29:25.972241 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 13 13:29:25.972251 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 13 13:29:25.972264 systemd-journald[1117]: Journal started
Dec 13 13:29:25.972311 systemd-journald[1117]: Runtime Journal (/run/log/journal/9b8aa4c4d49d483d8d3d4e9b09cd7730) is 8.0M, max 76.5M, 68.5M free.
Dec 13 13:29:25.715516 systemd[1]: Queued start job for default target multi-user.target.
Dec 13 13:29:25.740581 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Dec 13 13:29:25.741085 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 13 13:29:25.977526 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 13 13:29:25.979465 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 13 13:29:25.980599 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 13 13:29:25.981661 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 13 13:29:25.982590 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 13 13:29:25.983662 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 13 13:29:25.983818 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 13 13:29:25.985395 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 13 13:29:25.987077 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 13 13:29:25.988286 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 13 13:29:25.988570 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 13 13:29:25.989461 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 13 13:29:25.992742 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 13 13:29:26.008776 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 13 13:29:26.022424 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 13 13:29:26.031782 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 13 13:29:26.040792 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 13 13:29:26.042979 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 13 13:29:26.043024 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 13 13:29:26.048313 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Dec 13 13:29:26.057716 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 13 13:29:26.073053 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 13 13:29:26.074320 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 13 13:29:26.076737 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 13 13:29:26.081553 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 13 13:29:26.082489 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 13 13:29:26.090814 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 13 13:29:26.093018 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 13 13:29:26.094686 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 13 13:29:26.103836 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 13 13:29:26.118151 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 13 13:29:26.120725 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 13 13:29:26.124569 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 13 13:29:26.125824 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 13 13:29:26.126644 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 13 13:29:26.130115 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 13 13:29:26.133563 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 13 13:29:26.142727 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 13 13:29:26.144480 systemd-journald[1117]: Time spent on flushing to /var/log/journal/9b8aa4c4d49d483d8d3d4e9b09cd7730 is 106.999ms for 1129 entries.
Dec 13 13:29:26.144480 systemd-journald[1117]: System Journal (/var/log/journal/9b8aa4c4d49d483d8d3d4e9b09cd7730) is 8.0M, max 584.8M, 576.8M free.
Dec 13 13:29:26.273096 systemd-journald[1117]: Received client request to flush runtime journal.
Dec 13 13:29:26.273157 kernel: loop0: detected capacity change from 0 to 8
Dec 13 13:29:26.273180 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 13 13:29:26.273196 kernel: loop1: detected capacity change from 0 to 113552
Dec 13 13:29:26.273221 kernel: loop2: detected capacity change from 0 to 116784
Dec 13 13:29:26.160831 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Dec 13 13:29:26.165048 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Dec 13 13:29:26.179636 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 13 13:29:26.215617 systemd-tmpfiles[1168]: ACLs are not supported, ignoring.
Dec 13 13:29:26.215628 systemd-tmpfiles[1168]: ACLs are not supported, ignoring.
Dec 13 13:29:26.218108 udevadm[1178]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Dec 13 13:29:26.231219 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 13 13:29:26.233911 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Dec 13 13:29:26.238781 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 13 13:29:26.251827 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 13 13:29:26.282397 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 13 13:29:26.304005 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 13 13:29:26.310554 kernel: loop3: detected capacity change from 0 to 194512
Dec 13 13:29:26.314026 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 13 13:29:26.353314 systemd-tmpfiles[1192]: ACLs are not supported, ignoring.
Dec 13 13:29:26.356898 kernel: loop4: detected capacity change from 0 to 8
Dec 13 13:29:26.353340 systemd-tmpfiles[1192]: ACLs are not supported, ignoring.
Dec 13 13:29:26.359287 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 13 13:29:26.362767 kernel: loop5: detected capacity change from 0 to 113552
Dec 13 13:29:26.377581 kernel: loop6: detected capacity change from 0 to 116784
Dec 13 13:29:26.389567 kernel: loop7: detected capacity change from 0 to 194512
Dec 13 13:29:26.403644 (sd-merge)[1195]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Dec 13 13:29:26.404232 (sd-merge)[1195]: Merged extensions into '/usr'.
Dec 13 13:29:26.409735 systemd[1]: Reloading requested from client PID 1167 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 13 13:29:26.409899 systemd[1]: Reloading...
Dec 13 13:29:26.560531 zram_generator::config[1225]: No configuration found.
Dec 13 13:29:26.690249 ldconfig[1162]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 13 13:29:26.710717 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Dec 13 13:29:26.759072 systemd[1]: Reloading finished in 348 ms.
Dec 13 13:29:26.794355 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 13 13:29:26.795642 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 13 13:29:26.805855 systemd[1]: Starting ensure-sysext.service...
Dec 13 13:29:26.810841 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 13 13:29:26.818817 systemd[1]: Reloading requested from client PID 1259 ('systemctl') (unit ensure-sysext.service)...
Dec 13 13:29:26.818839 systemd[1]: Reloading...
Dec 13 13:29:26.860236 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 13 13:29:26.860449 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 13 13:29:26.861206 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 13 13:29:26.861419 systemd-tmpfiles[1260]: ACLs are not supported, ignoring.
Dec 13 13:29:26.861464 systemd-tmpfiles[1260]: ACLs are not supported, ignoring.
Dec 13 13:29:26.871879 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot.
Dec 13 13:29:26.871897 systemd-tmpfiles[1260]: Skipping /boot
Dec 13 13:29:26.885443 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot.
Dec 13 13:29:26.885462 systemd-tmpfiles[1260]: Skipping /boot
Dec 13 13:29:26.912537 zram_generator::config[1282]: No configuration found.
Dec 13 13:29:27.026844 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Dec 13 13:29:27.075094 systemd[1]: Reloading finished in 255 ms.
Dec 13 13:29:27.094531 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 13 13:29:27.108429 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 13 13:29:27.129490 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 13 13:29:27.134757 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 13 13:29:27.139648 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 13 13:29:27.147554 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 13 13:29:27.157874 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 13 13:29:27.164832 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 13 13:29:27.167738 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 13 13:29:27.172890 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 13 13:29:27.178616 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 13 13:29:27.184815 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 13 13:29:27.185551 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 13 13:29:27.190955 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 13 13:29:27.195471 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 13 13:29:27.195648 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 13 13:29:27.200406 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 13 13:29:27.202785 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 13 13:29:27.204708 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 13 13:29:27.212677 systemd[1]: Finished ensure-sysext.service.
Dec 13 13:29:27.213791 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 13 13:29:27.220824 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Dec 13 13:29:27.226449 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 13 13:29:27.227386 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 13 13:29:27.228852 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 13 13:29:27.235328 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 13 13:29:27.235621 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 13 13:29:27.244998 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 13 13:29:27.246815 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 13 13:29:27.247595 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 13 13:29:27.251116 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 13 13:29:27.265314 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 13 13:29:27.268034 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 13 13:29:27.268189 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 13 13:29:27.270084 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 13 13:29:27.280879 systemd-udevd[1335]: Using default interface naming scheme 'v255'.
Dec 13 13:29:27.290146 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 13 13:29:27.292200 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 13 13:29:27.296457 augenrules[1366]: No rules
Dec 13 13:29:27.298150 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 13 13:29:27.298359 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 13 13:29:27.312626 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 13 13:29:27.322185 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 13 13:29:27.323709 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 13 13:29:27.444632 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Dec 13 13:29:27.460519 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Dec 13 13:29:27.461331 systemd[1]: Reached target time-set.target - System Time Set.
Dec 13 13:29:27.466975 systemd-resolved[1329]: Positive Trust Anchors:
Dec 13 13:29:27.467337 systemd-networkd[1374]: lo: Link UP
Dec 13 13:29:27.467349 systemd-networkd[1374]: lo: Gained carrier
Dec 13 13:29:27.467656 systemd-resolved[1329]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 13 13:29:27.467738 systemd-resolved[1329]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 13 13:29:27.468635 systemd-timesyncd[1347]: No network connectivity, watching for changes.
Dec 13 13:29:27.470681 systemd-networkd[1374]: Enumeration completed
Dec 13 13:29:27.470803 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 13 13:29:27.474271 systemd-resolved[1329]: Using system hostname 'ci-4186-0-0-4-8ed7fad560'.
Dec 13 13:29:27.484849 kernel: BTRFS info: devid 1 device path /dev/mapper/usr changed to /dev/dm-0 scanned by (udev-worker) (1382)
Dec 13 13:29:27.487283 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 13 13:29:27.489077 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 13 13:29:27.491753 systemd[1]: Reached target network.target - Network.
Dec 13 13:29:27.492381 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 13 13:29:27.506606 kernel: BTRFS info: devid 1 device path /dev/dm-0 changed to /dev/mapper/usr scanned by (udev-worker) (1382)
Dec 13 13:29:27.544509 systemd-networkd[1374]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 13 13:29:27.545475 systemd-networkd[1374]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 13 13:29:27.547359 systemd-networkd[1374]: eth0: Link UP
Dec 13 13:29:27.547372 systemd-networkd[1374]: eth0: Gained carrier
Dec 13 13:29:27.547395 systemd-networkd[1374]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 13 13:29:27.578942 systemd-networkd[1374]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 13 13:29:27.578953 systemd-networkd[1374]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 13 13:29:27.580329 systemd-networkd[1374]: eth1: Link UP
Dec 13 13:29:27.580341 systemd-networkd[1374]: eth1: Gained carrier
Dec 13 13:29:27.580361 systemd-networkd[1374]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 13 13:29:27.582989 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1382)
Dec 13 13:29:27.614534 kernel: mousedev: PS/2 mouse device common for all mice
Dec 13 13:29:27.616896 systemd-networkd[1374]: eth0: DHCPv4 address 188.245.225.138/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 13 13:29:27.617732 systemd-timesyncd[1347]: Network configuration changed, trying to establish connection.
Dec 13 13:29:27.642571 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Dec 13 13:29:27.642704 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 13 13:29:27.650152 systemd-networkd[1374]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Dec 13 13:29:27.652091 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 13 13:29:27.655718 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 13 13:29:27.661779 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 13 13:29:27.662614 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 13 13:29:27.662660 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 13 13:29:27.663042 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 13 13:29:27.663242 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 13 13:29:27.688081 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 13 13:29:27.688556 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 13 13:29:27.694911 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 13 13:29:27.696585 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 13 13:29:27.697717 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 13 13:29:27.697773 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 13 13:29:27.722941 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 13 13:29:27.726458 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Dec 13 13:29:27.730529 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Dec 13 13:29:27.730594 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 13 13:29:27.730631 kernel: [drm] features: -context_init
Dec 13 13:29:27.731512 kernel: [drm] number of scanouts: 1
Dec 13 13:29:27.731555 kernel: [drm] number of cap sets: 0
Dec 13 13:29:27.735726 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Dec 13 13:29:27.738808 systemd-timesyncd[1347]: Contacted time server 144.76.59.106:123 (1.flatcar.pool.ntp.org).
Dec 13 13:29:27.738877 systemd-timesyncd[1347]: Initial clock synchronization to Fri 2024-12-13 13:29:27.964034 UTC.
Dec 13 13:29:27.742774 kernel: Console: switching to colour frame buffer device 160x50
Dec 13 13:29:27.746860 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Dec 13 13:29:27.756528 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 13 13:29:27.770111 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 13 13:29:27.772575 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 13 13:29:27.779827 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 13 13:29:27.781894 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Dec 13 13:29:27.842683 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 13 13:29:27.880531 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Dec 13 13:29:27.889847 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Dec 13 13:29:27.903534 lvm[1444]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Dec 13 13:29:27.934072 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Dec 13 13:29:27.937215 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 13 13:29:27.938166 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 13 13:29:27.938972 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 13 13:29:27.939799 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 13 13:29:27.940716 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 13 13:29:27.941421 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 13 13:29:27.942258 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 13 13:29:27.943851 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 13 13:29:27.943901 systemd[1]: Reached target paths.target - Path Units.
Dec 13 13:29:27.944525 systemd[1]: Reached target timers.target - Timer Units.
Dec 13 13:29:27.946267 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 13 13:29:27.948409 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 13 13:29:27.957474 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 13 13:29:27.960322 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Dec 13 13:29:27.961949 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 13 13:29:27.962828 systemd[1]: Reached target sockets.target - Socket Units.
Dec 13 13:29:27.963509 systemd[1]: Reached target basic.target - Basic System.
Dec 13 13:29:27.964215 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 13 13:29:27.964255 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 13 13:29:27.967720 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 13 13:29:27.972965 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 13 13:29:27.977116 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 13 13:29:27.980615 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 13 13:29:27.985595 lvm[1448]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Dec 13 13:29:27.994732 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 13 13:29:27.995338 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 13 13:29:27.999522 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 13 13:29:28.002189 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 13 13:29:28.013765 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Dec 13 13:29:28.015878 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 13 13:29:28.021799 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 13 13:29:28.030007 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 13 13:29:28.031373 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 13 13:29:28.033027 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 13 13:29:28.036619 systemd[1]: Starting update-engine.service - Update Engine...
Dec 13 13:29:28.043094 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 13 13:29:28.046566 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Dec 13 13:29:28.049998 systemd[1]: motdgen.service: Deactivated successfully.
Dec 13 13:29:28.051497 jq[1452]: false
Dec 13 13:29:28.051652 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 13 13:29:28.052968 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 13 13:29:28.054297 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 13 13:29:28.065166 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 13 13:29:28.065394 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 13 13:29:28.087700 jq[1470]: true
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found loop4
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found loop5
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found loop6
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found loop7
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda1
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda2
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda3
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found usr
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda4
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda6
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda7
Dec 13 13:29:28.089222 extend-filesystems[1453]: Found sda9
Dec 13 13:29:28.089222 extend-filesystems[1453]: Checking size of /dev/sda9
Dec 13 13:29:28.147585 coreos-metadata[1450]: Dec 13 13:29:28.088 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Dec 13 13:29:28.147585 coreos-metadata[1450]: Dec 13 13:29:28.095 INFO Fetch successful
Dec 13 13:29:28.147585 coreos-metadata[1450]: Dec 13 13:29:28.096 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Dec 13 13:29:28.147585 coreos-metadata[1450]: Dec 13 13:29:28.097 INFO Fetch successful
Dec 13 13:29:28.132107 dbus-daemon[1451]: [system] SELinux support is enabled
Dec 13 13:29:28.160357 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Dec 13 13:29:28.117816 (ntainerd)[1480]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Dec 13 13:29:28.161713 extend-filesystems[1453]: Resized partition /dev/sda9
Dec 13 13:29:28.132358 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 13 13:29:28.164590 extend-filesystems[1492]: resize2fs 1.47.1 (20-May-2024)
Dec 13 13:29:28.138964 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 13 13:29:28.170662 update_engine[1466]: I20241213 13:29:28.164916 1466 main.cc:92] Flatcar Update Engine starting
Dec 13 13:29:28.139020 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 13 13:29:28.171012 tar[1479]: linux-arm64/helm
Dec 13 13:29:28.146792 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 13 13:29:28.146815 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 13 13:29:28.176002 jq[1482]: true
Dec 13 13:29:28.184403 update_engine[1466]: I20241213 13:29:28.181772 1466 update_check_scheduler.cc:74] Next update check in 3m42s
Dec 13 13:29:28.190139 systemd[1]: Started update-engine.service - Update Engine.
Dec 13 13:29:28.196969 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 13 13:29:28.269830 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 13 13:29:28.281338 systemd-logind[1464]: New seat seat0.
Dec 13 13:29:28.285418 systemd-logind[1464]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 13 13:29:28.288410 systemd-logind[1464]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Dec 13 13:29:28.340707 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 13 13:29:28.346374 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 13 13:29:28.349555 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1387)
Dec 13 13:29:28.352230 bash[1521]: Updated "/home/core/.ssh/authorized_keys"
Dec 13 13:29:28.355384 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 13 13:29:28.364166 systemd[1]: Starting sshkeys.service...
Dec 13 13:29:28.420657 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Dec 13 13:29:28.438774 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 13 13:29:28.450746 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 13 13:29:28.469316 extend-filesystems[1492]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Dec 13 13:29:28.469316 extend-filesystems[1492]: old_desc_blocks = 1, new_desc_blocks = 5
Dec 13 13:29:28.469316 extend-filesystems[1492]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Dec 13 13:29:28.476617 extend-filesystems[1453]: Resized filesystem in /dev/sda9
Dec 13 13:29:28.476617 extend-filesystems[1453]: Found sr0
Dec 13 13:29:28.482197 systemd[1]: extend-filesystems.service: Deactivated successfully.
Dec 13 13:29:28.485632 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Dec 13 13:29:28.513575 containerd[1480]: time="2024-12-13T13:29:28.512970995Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Dec 13 13:29:28.553773 coreos-metadata[1528]: Dec 13 13:29:28.553 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Dec 13 13:29:28.558755 coreos-metadata[1528]: Dec 13 13:29:28.558 INFO Fetch successful
Dec 13 13:29:28.569640 unknown[1528]: wrote ssh authorized keys file for user: core
Dec 13 13:29:28.595247 containerd[1480]: time="2024-12-13T13:29:28.593566497Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.597725 containerd[1480]: time="2024-12-13T13:29:28.597678182Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.65-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Dec 13 13:29:28.598077 containerd[1480]: time="2024-12-13T13:29:28.598059016Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Dec 13 13:29:28.598144 containerd[1480]: time="2024-12-13T13:29:28.598128520Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Dec 13 13:29:28.598410 containerd[1480]: time="2024-12-13T13:29:28.598386796Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599549901Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599657159Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599675419Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599872293Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599887921Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599902357Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Dec 13 13:29:28.599954 containerd[1480]: time="2024-12-13T13:29:28.599912227Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.600841 containerd[1480]: time="2024-12-13T13:29:28.600814177Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.601167 containerd[1480]: time="2024-12-13T13:29:28.601143890Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Dec 13 13:29:28.601897 containerd[1480]: time="2024-12-13T13:29:28.601871751Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Dec 13 13:29:28.601970 containerd[1480]: time="2024-12-13T13:29:28.601956760Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Dec 13 13:29:28.602614 containerd[1480]: time="2024-12-13T13:29:28.602587561Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Dec 13 13:29:28.602769 containerd[1480]: time="2024-12-13T13:29:28.602750259Z" level=info msg="metadata content store policy set" policy=shared
Dec 13 13:29:28.612453 containerd[1480]: time="2024-12-13T13:29:28.612407101Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Dec 13 13:29:28.614214 containerd[1480]: time="2024-12-13T13:29:28.613612073Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Dec 13 13:29:28.614214 containerd[1480]: time="2024-12-13T13:29:28.613654803Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Dec 13 13:29:28.614214 containerd[1480]: time="2024-12-13T13:29:28.613727228Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Dec 13 13:29:28.614214 containerd[1480]: time="2024-12-13T13:29:28.613745611Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Dec 13 13:29:28.614214 containerd[1480]: time="2024-12-13T13:29:28.613940675Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Dec 13 13:29:28.614452 containerd[1480]: time="2024-12-13T13:29:28.614421981Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Dec 13 13:29:28.614521 update-ssh-keys[1538]: Updated "/home/core/.ssh/authorized_keys"
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615804867Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615838879Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615857962Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615875317Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615889712Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615912866Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615930468Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615947330Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615961766Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615975255Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.615987388Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.616011899Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.616027980Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.618714 containerd[1480]: time="2024-12-13T13:29:28.616041181Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616054013Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616067832Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616082802Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616094482Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616106984Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616119363Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616134005Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616146384Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616158146Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616171759Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616188086Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616212104Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616227074Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.619022 containerd[1480]: time="2024-12-13T13:29:28.616249735Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Dec 13 13:29:28.619348 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Dec 13 13:29:28.622625 systemd[1]: Finished sshkeys.service.
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628662191Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628719933Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628734122Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628747858Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628759086Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628774261Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628785654Z" level=info msg="NRI interface is disabled by configuration."
Dec 13 13:29:28.630259 containerd[1480]: time="2024-12-13T13:29:28.628798608Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Dec 13 13:29:28.630505 containerd[1480]: time="2024-12-13T13:29:28.629225463Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Dec 13 13:29:28.630505 containerd[1480]: time="2024-12-13T13:29:28.629279174Z" level=info msg="Connect containerd service"
Dec 13 13:29:28.630505 containerd[1480]: time="2024-12-13T13:29:28.629350036Z" level=info msg="using legacy CRI server"
Dec 13 13:29:28.630505 containerd[1480]: time="2024-12-13T13:29:28.629360276Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Dec 13 13:29:28.630505 containerd[1480]: time="2024-12-13T13:29:28.629636977Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Dec 13 13:29:28.633331 containerd[1480]: time="2024-12-13T13:29:28.632833387Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Dec 13 13:29:28.633736 containerd[1480]: time="2024-12-13T13:29:28.633699886Z" level=info msg="Start subscribing containerd event"
Dec 13 13:29:28.634007 containerd[1480]: time="2024-12-13T13:29:28.633989582Z" level=info msg="Start recovering state"
Dec 13 13:29:28.634576 containerd[1480]: time="2024-12-13T13:29:28.633933485Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Dec 13 13:29:28.634712 containerd[1480]: time="2024-12-13T13:29:28.634694330Z" level=info msg=serving... address=/run/containerd/containerd.sock
Dec 13 13:29:28.640114 containerd[1480]: time="2024-12-13T13:29:28.639089542Z" level=info msg="Start event monitor"
Dec 13 13:29:28.640114 containerd[1480]: time="2024-12-13T13:29:28.639131327Z" level=info msg="Start snapshots syncer"
Dec 13 13:29:28.640114 containerd[1480]: time="2024-12-13T13:29:28.639144035Z" level=info msg="Start cni network conf syncer for default"
Dec 13 13:29:28.640114 containerd[1480]: time="2024-12-13T13:29:28.639157854Z" level=info msg="Start streaming server"
Dec 13 13:29:28.640114 containerd[1480]: time="2024-12-13T13:29:28.639317014Z" level=info msg="containerd successfully booted in 0.130392s"
Dec 13 13:29:28.639482 systemd[1]: Started containerd.service - containerd container runtime.
Dec 13 13:29:28.704349 locksmithd[1501]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 13 13:29:28.758716 systemd-networkd[1374]: eth0: Gained IPv6LL
Dec 13 13:29:28.767629 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Dec 13 13:29:28.771057 systemd[1]: Reached target network-online.target - Network is Online.
Dec 13 13:29:28.782756 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:29:28.787976 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Dec 13 13:29:28.842477 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Dec 13 13:29:28.864415 tar[1479]: linux-arm64/LICENSE
Dec 13 13:29:28.864558 tar[1479]: linux-arm64/README.md
Dec 13 13:29:28.880263 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Dec 13 13:29:29.386340 sshd_keygen[1496]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Dec 13 13:29:29.415138 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Dec 13 13:29:29.424358 systemd[1]: Starting issuegen.service - Generate /run/issue...
Dec 13 13:29:29.431009 systemd[1]: issuegen.service: Deactivated successfully.
Dec 13 13:29:29.432093 systemd[1]: Finished issuegen.service - Generate /run/issue.
Dec 13 13:29:29.445138 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Dec 13 13:29:29.458200 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Dec 13 13:29:29.464182 systemd-networkd[1374]: eth1: Gained IPv6LL
Dec 13 13:29:29.466209 systemd[1]: Started getty@tty1.service - Getty on tty1.
Dec 13 13:29:29.479049 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Dec 13 13:29:29.480752 systemd[1]: Reached target getty.target - Login Prompts.
Dec 13 13:29:29.584098 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:29:29.585615 systemd[1]: Reached target multi-user.target - Multi-User System.
Dec 13 13:29:29.589676 systemd[1]: Startup finished in 778ms (kernel) + 5.526s (initrd) + 4.413s (userspace) = 10.718s.
Dec 13 13:29:29.593946 (kubelet)[1581]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:29:29.607289 agetty[1574]: failed to open credentials directory
Dec 13 13:29:29.607355 agetty[1575]: failed to open credentials directory
Dec 13 13:29:30.257699 kubelet[1581]: E1213 13:29:30.257543 1581 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:29:30.260811 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:29:30.260993 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:29:40.511563 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Dec 13 13:29:40.527944 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:29:40.652795 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:29:40.672131 (kubelet)[1601]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:29:40.734399 kubelet[1601]: E1213 13:29:40.734329 1601 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:29:40.739374 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:29:40.739586 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:29:50.941405 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Dec 13 13:29:50.948807 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:29:51.064457 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:29:51.077690 (kubelet)[1618]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:29:51.144541 kubelet[1618]: E1213 13:29:51.144409 1618 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:29:51.149031 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:29:51.150512 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:30:01.191395 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Dec 13 13:30:01.198828 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:30:01.302253 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:30:01.307806 (kubelet)[1634]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:30:01.370669 kubelet[1634]: E1213 13:30:01.370437 1634 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:30:01.374853 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:30:01.375036 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:30:11.441452 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Dec 13 13:30:11.458431 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:30:11.591888 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:30:11.592580 (kubelet)[1651]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:30:11.645562 kubelet[1651]: E1213 13:30:11.645438 1651 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:30:11.648754 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:30:11.648910 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:30:13.911737 update_engine[1466]: I20241213 13:30:13.911041 1466 update_attempter.cc:509] Updating boot flags...
Dec 13 13:30:13.965529 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1668)
Dec 13 13:30:14.031619 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 38 scanned by (udev-worker) (1668)
Dec 13 13:30:21.691148 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Dec 13 13:30:21.700140 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:30:21.818811 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:30:21.829943 (kubelet)[1685]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:30:21.886843 kubelet[1685]: E1213 13:30:21.886692 1685 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:30:21.889419 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:30:21.889614 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:30:31.941808 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Dec 13 13:30:31.953883 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:30:32.067463 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:30:32.078929 (kubelet)[1701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:30:32.145535 kubelet[1701]: E1213 13:30:32.145410 1701 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:30:32.148784 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:30:32.149095 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:30:42.192043 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Dec 13 13:30:42.199825 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:30:42.333731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:30:42.339984 (kubelet)[1717]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:30:42.398148 kubelet[1717]: E1213 13:30:42.398050 1717 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:30:42.401829 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:30:42.402127 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:30:52.441172 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Dec 13 13:30:52.449866 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:30:52.567742 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:30:52.577137 (kubelet)[1734]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:30:52.633739 kubelet[1734]: E1213 13:30:52.633579 1734 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:30:52.636195 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:30:52.636333 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:31:02.691281 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Dec 13 13:31:02.698797 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:02.813455 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:02.825067 (kubelet)[1750]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:31:02.878722 kubelet[1750]: E1213 13:31:02.878544 1750 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:31:02.880941 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:31:02.881071 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:31:12.941395 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
Dec 13 13:31:12.953982 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:13.084974 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:13.099147 (kubelet)[1765]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:31:13.165360 kubelet[1765]: E1213 13:31:13.165244 1765 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:31:13.169357 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:31:13.169727 systemd[1]: kubelet.service: Failed with result 'exit-code'.
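[Editor's note] The restart loop above is the classic symptom of kubelet.service being enabled before `kubeadm init` or `kubeadm join` has written /var/lib/kubelet/config.yaml; every start attempt exits 1 with the same "no such file or directory" error until that file appears. A minimal sketch of the failing precondition, run against a scratch directory rather than the live node:

```shell
#!/bin/sh
# Sketch: reproduce the precondition the log shows.
# KUBELET_DIR stands in for the real /var/lib/kubelet.
KUBELET_DIR="$(mktemp -d)"

if [ ! -f "$KUBELET_DIR/config.yaml" ]; then
    # This is the state in the journal: the unit keeps restarting
    # until kubeadm creates the config file.
    echo "config.yaml missing - kubelet exits 1 until kubeadm writes it"
fi

rm -rf "$KUBELET_DIR"
```

Once kubeadm runs, the file exists and the next scheduled restart succeeds, which is why such loops are usually benign on a node that has not yet joined a cluster.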
Dec 13 13:31:13.924626 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Dec 13 13:31:13.936891 systemd[1]: Started sshd@0-188.245.225.138:22-147.75.109.163:48462.service - OpenSSH per-connection server daemon (147.75.109.163:48462).
Dec 13 13:31:14.922812 sshd[1775]: Accepted publickey for core from 147.75.109.163 port 48462 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:14.927246 sshd-session[1775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:14.937349 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Dec 13 13:31:14.942863 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Dec 13 13:31:14.946737 systemd-logind[1464]: New session 1 of user core.
Dec 13 13:31:14.955492 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Dec 13 13:31:14.963093 systemd[1]: Starting user@500.service - User Manager for UID 500...
Dec 13 13:31:14.967916 (systemd)[1779]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Dec 13 13:31:15.082811 systemd[1779]: Queued start job for default target default.target.
Dec 13 13:31:15.095562 systemd[1779]: Created slice app.slice - User Application Slice.
Dec 13 13:31:15.095617 systemd[1779]: Reached target paths.target - Paths.
Dec 13 13:31:15.095644 systemd[1779]: Reached target timers.target - Timers.
Dec 13 13:31:15.098323 systemd[1779]: Starting dbus.socket - D-Bus User Message Bus Socket...
Dec 13 13:31:15.122298 systemd[1779]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Dec 13 13:31:15.122438 systemd[1779]: Reached target sockets.target - Sockets.
Dec 13 13:31:15.122452 systemd[1779]: Reached target basic.target - Basic System.
Dec 13 13:31:15.122515 systemd[1779]: Reached target default.target - Main User Target.
Dec 13 13:31:15.122557 systemd[1779]: Startup finished in 146ms.
Dec 13 13:31:15.122921 systemd[1]: Started user@500.service - User Manager for UID 500.
Dec 13 13:31:15.135052 systemd[1]: Started session-1.scope - Session 1 of User core.
Dec 13 13:31:15.827631 systemd[1]: Started sshd@1-188.245.225.138:22-147.75.109.163:48478.service - OpenSSH per-connection server daemon (147.75.109.163:48478).
Dec 13 13:31:16.813376 sshd[1790]: Accepted publickey for core from 147.75.109.163 port 48478 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:16.815597 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:16.822230 systemd-logind[1464]: New session 2 of user core.
Dec 13 13:31:16.832804 systemd[1]: Started session-2.scope - Session 2 of User core.
Dec 13 13:31:17.494955 sshd[1792]: Connection closed by 147.75.109.163 port 48478
Dec 13 13:31:17.496302 sshd-session[1790]: pam_unix(sshd:session): session closed for user core
Dec 13 13:31:17.501693 systemd[1]: sshd@1-188.245.225.138:22-147.75.109.163:48478.service: Deactivated successfully.
Dec 13 13:31:17.503986 systemd[1]: session-2.scope: Deactivated successfully.
Dec 13 13:31:17.508060 systemd-logind[1464]: Session 2 logged out. Waiting for processes to exit.
Dec 13 13:31:17.509435 systemd-logind[1464]: Removed session 2.
Dec 13 13:31:17.674695 systemd[1]: Started sshd@2-188.245.225.138:22-147.75.109.163:45558.service - OpenSSH per-connection server daemon (147.75.109.163:45558).
Dec 13 13:31:18.662588 sshd[1797]: Accepted publickey for core from 147.75.109.163 port 45558 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:18.665241 sshd-session[1797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:18.670897 systemd-logind[1464]: New session 3 of user core.
Dec 13 13:31:18.678864 systemd[1]: Started session-3.scope - Session 3 of User core.
Dec 13 13:31:19.344256 sshd[1799]: Connection closed by 147.75.109.163 port 45558
Dec 13 13:31:19.344936 sshd-session[1797]: pam_unix(sshd:session): session closed for user core
Dec 13 13:31:19.351529 systemd[1]: sshd@2-188.245.225.138:22-147.75.109.163:45558.service: Deactivated successfully.
Dec 13 13:31:19.354225 systemd[1]: session-3.scope: Deactivated successfully.
Dec 13 13:31:19.356352 systemd-logind[1464]: Session 3 logged out. Waiting for processes to exit.
Dec 13 13:31:19.359248 systemd-logind[1464]: Removed session 3.
Dec 13 13:31:19.537260 systemd[1]: Started sshd@3-188.245.225.138:22-147.75.109.163:45560.service - OpenSSH per-connection server daemon (147.75.109.163:45560).
Dec 13 13:31:20.509604 sshd[1804]: Accepted publickey for core from 147.75.109.163 port 45560 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:20.512344 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:20.519576 systemd-logind[1464]: New session 4 of user core.
Dec 13 13:31:20.528962 systemd[1]: Started session-4.scope - Session 4 of User core.
Dec 13 13:31:21.188953 sshd[1806]: Connection closed by 147.75.109.163 port 45560
Dec 13 13:31:21.189642 sshd-session[1804]: pam_unix(sshd:session): session closed for user core
Dec 13 13:31:21.195936 systemd[1]: sshd@3-188.245.225.138:22-147.75.109.163:45560.service: Deactivated successfully.
Dec 13 13:31:21.200140 systemd[1]: session-4.scope: Deactivated successfully.
Dec 13 13:31:21.203167 systemd-logind[1464]: Session 4 logged out. Waiting for processes to exit.
Dec 13 13:31:21.204454 systemd-logind[1464]: Removed session 4.
Dec 13 13:31:21.355475 systemd[1]: Started sshd@4-188.245.225.138:22-147.75.109.163:45570.service - OpenSSH per-connection server daemon (147.75.109.163:45570).
Dec 13 13:31:22.343893 sshd[1811]: Accepted publickey for core from 147.75.109.163 port 45570 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:22.345967 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:22.355004 systemd-logind[1464]: New session 5 of user core.
Dec 13 13:31:22.360816 systemd[1]: Started session-5.scope - Session 5 of User core.
Dec 13 13:31:22.873280 sudo[1814]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Dec 13 13:31:22.873757 sudo[1814]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 13 13:31:22.896580 sudo[1814]: pam_unix(sudo:session): session closed for user root
Dec 13 13:31:23.057532 sshd[1813]: Connection closed by 147.75.109.163 port 45570
Dec 13 13:31:23.059266 sshd-session[1811]: pam_unix(sshd:session): session closed for user core
Dec 13 13:31:23.064841 systemd[1]: sshd@4-188.245.225.138:22-147.75.109.163:45570.service: Deactivated successfully.
Dec 13 13:31:23.066754 systemd[1]: session-5.scope: Deactivated successfully.
Dec 13 13:31:23.067853 systemd-logind[1464]: Session 5 logged out. Waiting for processes to exit.
Dec 13 13:31:23.069127 systemd-logind[1464]: Removed session 5.
Dec 13 13:31:23.191963 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
Dec 13 13:31:23.202118 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:23.225009 systemd[1]: Started sshd@5-188.245.225.138:22-147.75.109.163:45580.service - OpenSSH per-connection server daemon (147.75.109.163:45580).
Dec 13 13:31:23.330838 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:23.343126 (kubelet)[1829]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:31:23.400418 kubelet[1829]: E1213 13:31:23.400361 1829 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:31:23.403382 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:31:23.403807 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:31:24.217615 sshd[1822]: Accepted publickey for core from 147.75.109.163 port 45580 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:24.219909 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:24.226778 systemd-logind[1464]: New session 6 of user core.
Dec 13 13:31:24.236876 systemd[1]: Started session-6.scope - Session 6 of User core.
Dec 13 13:31:24.741669 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Dec 13 13:31:24.742059 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 13 13:31:24.747092 sudo[1838]: pam_unix(sudo:session): session closed for user root
Dec 13 13:31:24.754629 sudo[1837]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Dec 13 13:31:24.754967 sudo[1837]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 13 13:31:24.776982 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 13 13:31:24.811456 augenrules[1860]: No rules
Dec 13 13:31:24.812886 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 13 13:31:24.813094 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 13 13:31:24.814457 sudo[1837]: pam_unix(sudo:session): session closed for user root
Dec 13 13:31:24.973681 sshd[1836]: Connection closed by 147.75.109.163 port 45580
Dec 13 13:31:24.974590 sshd-session[1822]: pam_unix(sshd:session): session closed for user core
Dec 13 13:31:24.979538 systemd[1]: sshd@5-188.245.225.138:22-147.75.109.163:45580.service: Deactivated successfully.
Dec 13 13:31:24.982011 systemd[1]: session-6.scope: Deactivated successfully.
Dec 13 13:31:24.983915 systemd-logind[1464]: Session 6 logged out. Waiting for processes to exit.
Dec 13 13:31:24.985175 systemd-logind[1464]: Removed session 6.
Dec 13 13:31:25.144919 systemd[1]: Started sshd@6-188.245.225.138:22-147.75.109.163:45592.service - OpenSSH per-connection server daemon (147.75.109.163:45592).
Dec 13 13:31:26.122130 sshd[1868]: Accepted publickey for core from 147.75.109.163 port 45592 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:31:26.124649 sshd-session[1868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:31:26.136863 systemd-logind[1464]: New session 7 of user core.
Dec 13 13:31:26.139907 systemd[1]: Started session-7.scope - Session 7 of User core.
Dec 13 13:31:26.641108 sudo[1871]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Dec 13 13:31:26.641456 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 13 13:31:26.978981 (dockerd)[1890]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Dec 13 13:31:26.979487 systemd[1]: Starting docker.service - Docker Application Container Engine...
Dec 13 13:31:27.232434 dockerd[1890]: time="2024-12-13T13:31:27.232097514Z" level=info msg="Starting up"
Dec 13 13:31:27.334154 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2751500172-merged.mount: Deactivated successfully.
Dec 13 13:31:27.388738 dockerd[1890]: time="2024-12-13T13:31:27.388637049Z" level=info msg="Loading containers: start."
Dec 13 13:31:27.551691 kernel: Initializing XFRM netlink socket
Dec 13 13:31:27.655278 systemd-networkd[1374]: docker0: Link UP
Dec 13 13:31:27.696548 dockerd[1890]: time="2024-12-13T13:31:27.696460013Z" level=info msg="Loading containers: done."
Dec 13 13:31:27.715340 dockerd[1890]: time="2024-12-13T13:31:27.714693263Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Dec 13 13:31:27.715340 dockerd[1890]: time="2024-12-13T13:31:27.714860304Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1
Dec 13 13:31:27.715340 dockerd[1890]: time="2024-12-13T13:31:27.715064385Z" level=info msg="Daemon has completed initialization"
Dec 13 13:31:27.761254 dockerd[1890]: time="2024-12-13T13:31:27.761141694Z" level=info msg="API listen on /run/docker.sock"
Dec 13 13:31:27.762391 systemd[1]: Started docker.service - Docker Application Container Engine.
Dec 13 13:31:28.923055 containerd[1480]: time="2024-12-13T13:31:28.922607580Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\""
Dec 13 13:31:29.579514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3233949483.mount: Deactivated successfully.
Dec 13 13:31:31.778538 containerd[1480]: time="2024-12-13T13:31:31.777651723Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:31.779402 containerd[1480]: time="2024-12-13T13:31:31.779342859Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.12: active requests=0, bytes read=32201342"
Dec 13 13:31:31.781004 containerd[1480]: time="2024-12-13T13:31:31.780930314Z" level=info msg="ImageCreate event name:\"sha256:50c86b7f73fdd28bacd4abf45260c9d3abc3b57eb038fa61fc45b5d0f2763e6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:31.785623 containerd[1480]: time="2024-12-13T13:31:31.785562917Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:31.787368 containerd[1480]: time="2024-12-13T13:31:31.786843289Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.12\" with image id \"sha256:50c86b7f73fdd28bacd4abf45260c9d3abc3b57eb038fa61fc45b5d0f2763e6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\", size \"32198050\" in 2.863383664s"
Dec 13 13:31:31.787368 containerd[1480]: time="2024-12-13T13:31:31.786889769Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\" returns image reference \"sha256:50c86b7f73fdd28bacd4abf45260c9d3abc3b57eb038fa61fc45b5d0f2763e6f\""
Dec 13 13:31:31.818846 containerd[1480]: time="2024-12-13T13:31:31.818744666Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\""
Dec 13 13:31:33.441253 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12.
Dec 13 13:31:33.452595 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:33.575141 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:33.589607 (kubelet)[2151]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:31:33.646454 kubelet[2151]: E1213 13:31:33.646186 2151 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:31:33.649856 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:31:33.650107 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 13 13:31:34.627126 containerd[1480]: time="2024-12-13T13:31:34.625790542Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:34.628422 containerd[1480]: time="2024-12-13T13:31:34.628370414Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.12: active requests=0, bytes read=29381317"
Dec 13 13:31:34.629865 containerd[1480]: time="2024-12-13T13:31:34.629798831Z" level=info msg="ImageCreate event name:\"sha256:2d47abaa6ccc533f84ef74fff6d509de10bb040317351b45afe95a8021a1ddf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:34.633180 containerd[1480]: time="2024-12-13T13:31:34.633144753Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:34.634524 containerd[1480]: time="2024-12-13T13:31:34.634462489Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.12\" with image id \"sha256:2d47abaa6ccc533f84ef74fff6d509de10bb040317351b45afe95a8021a1ddf7\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\", size \"30783618\" in 2.815672142s"
Dec 13 13:31:34.634643 containerd[1480]: time="2024-12-13T13:31:34.634621451Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\" returns image reference \"sha256:2d47abaa6ccc533f84ef74fff6d509de10bb040317351b45afe95a8021a1ddf7\""
Dec 13 13:31:34.659694 containerd[1480]: time="2024-12-13T13:31:34.659660198Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\""
Dec 13 13:31:36.410909 containerd[1480]: time="2024-12-13T13:31:36.410821432Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:36.413228 containerd[1480]: time="2024-12-13T13:31:36.412579536Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.12: active requests=0, bytes read=15765660"
Dec 13 13:31:36.415409 containerd[1480]: time="2024-12-13T13:31:36.415342895Z" level=info msg="ImageCreate event name:\"sha256:ae633c52a23907b58f7a7867d2cccf3d3f5ebd8977beb6788e20fbecd3f446db\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:36.420361 containerd[1480]: time="2024-12-13T13:31:36.420290765Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:36.421608 containerd[1480]: time="2024-12-13T13:31:36.421564823Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.12\" with image id \"sha256:ae633c52a23907b58f7a7867d2cccf3d3f5ebd8977beb6788e20fbecd3f446db\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\", size \"17167979\" in 1.761866144s"
Dec 13 13:31:36.421743 containerd[1480]: time="2024-12-13T13:31:36.421728345Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\" returns image reference \"sha256:ae633c52a23907b58f7a7867d2cccf3d3f5ebd8977beb6788e20fbecd3f446db\""
Dec 13 13:31:36.449474 containerd[1480]: time="2024-12-13T13:31:36.449427734Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\""
Dec 13 13:31:37.352866 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2486540437.mount: Deactivated successfully.
Dec 13 13:31:37.994796 containerd[1480]: time="2024-12-13T13:31:37.994003230Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:37.995273 containerd[1480]: time="2024-12-13T13:31:37.995227728Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.12: active requests=0, bytes read=25274003"
Dec 13 13:31:37.996415 containerd[1480]: time="2024-12-13T13:31:37.996387786Z" level=info msg="ImageCreate event name:\"sha256:768ee8cfd9311233d038d18430c18136e1ae4dd2e6de40fcf1c670bba2da6d06\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:37.999094 containerd[1480]: time="2024-12-13T13:31:37.999046305Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:37.999908 containerd[1480]: time="2024-12-13T13:31:37.999871878Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.12\" with image id \"sha256:768ee8cfd9311233d038d18430c18136e1ae4dd2e6de40fcf1c670bba2da6d06\", repo tag \"registry.k8s.io/kube-proxy:v1.29.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\", size \"25272996\" in 1.550393542s"
Dec 13 13:31:38.000065 containerd[1480]: time="2024-12-13T13:31:38.000044800Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\" returns image reference \"sha256:768ee8cfd9311233d038d18430c18136e1ae4dd2e6de40fcf1c670bba2da6d06\""
Dec 13 13:31:38.021395 containerd[1480]: time="2024-12-13T13:31:38.021360295Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Dec 13 13:31:38.651764 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1692523604.mount: Deactivated successfully.
Dec 13 13:31:39.565738 containerd[1480]: time="2024-12-13T13:31:39.565683686Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:39.568486 containerd[1480]: time="2024-12-13T13:31:39.568042445Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485461"
Dec 13 13:31:39.569892 containerd[1480]: time="2024-12-13T13:31:39.569718592Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:39.575158 containerd[1480]: time="2024-12-13T13:31:39.575100962Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:39.576609 containerd[1480]: time="2024-12-13T13:31:39.576560306Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.554981287s"
Dec 13 13:31:39.576695 containerd[1480]: time="2024-12-13T13:31:39.576614747Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\""
Dec 13 13:31:39.609837 containerd[1480]: time="2024-12-13T13:31:39.609582453Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Dec 13 13:31:40.144535 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3251801537.mount: Deactivated successfully.
Dec 13 13:31:40.153591 containerd[1480]: time="2024-12-13T13:31:40.153012414Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:40.154574 containerd[1480]: time="2024-12-13T13:31:40.154477159Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268841"
Dec 13 13:31:40.155913 containerd[1480]: time="2024-12-13T13:31:40.155852423Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:40.160528 containerd[1480]: time="2024-12-13T13:31:40.159781491Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:40.160704 containerd[1480]: time="2024-12-13T13:31:40.160611545Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 550.982852ms"
Dec 13 13:31:40.160704 containerd[1480]: time="2024-12-13T13:31:40.160645826Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\""
Dec 13 13:31:40.182742 containerd[1480]: time="2024-12-13T13:31:40.182704249Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\""
Dec 13 13:31:40.782709 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1101338606.mount: Deactivated successfully.
Dec 13 13:31:43.691586 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13.
Dec 13 13:31:43.700227 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:43.831705 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:43.841553 (kubelet)[2297]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 13 13:31:43.898555 kubelet[2297]: E1213 13:31:43.897111 2297 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 13 13:31:43.901178 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 13 13:31:43.901350 systemd[1]: kubelet.service: Failed with result 'exit-code'.
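[Editor's note] The kubelet restart counter climbs from 6 to 13 across this excerpt. When triaging a saved journal dump, the counter values can be pulled out with standard tools; `journal.txt` below is a placeholder name for wherever the excerpt was saved:

```shell
#!/bin/sh
# Sketch: list kubelet restart counters from a saved journal excerpt.
# journal.txt is an illustrative placeholder for the saved log file.
grep -o "restart counter is at [0-9]*" journal.txt |
    awk '{ print $NF }'
```

A monotonically increasing counter with identical error lines in between, as seen here, points at a fixed precondition (the missing config file) rather than a flapping dependency.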
Dec 13 13:31:44.236909 containerd[1480]: time="2024-12-13T13:31:44.236843941Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:44.286307 containerd[1480]: time="2024-12-13T13:31:44.286222820Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=65200866"
Dec 13 13:31:44.381537 containerd[1480]: time="2024-12-13T13:31:44.380601491Z" level=info msg="ImageCreate event name:\"sha256:79f8d13ae8b8839cadfb2f83416935f5184206d386028e2d1263577f0ab3620b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:44.392010 containerd[1480]: time="2024-12-13T13:31:44.391917040Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:31:44.393063 containerd[1480]: time="2024-12-13T13:31:44.393022022Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:79f8d13ae8b8839cadfb2f83416935f5184206d386028e2d1263577f0ab3620b\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"65198393\" in 4.210273493s"
Dec 13 13:31:44.393283 containerd[1480]: time="2024-12-13T13:31:44.393261907Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:79f8d13ae8b8839cadfb2f83416935f5184206d386028e2d1263577f0ab3620b\""
Dec 13 13:31:50.770880 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:50.785221 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:50.815521 systemd[1]: Reloading requested from client PID 2371 ('systemctl') (unit session-7.scope)...
Dec 13 13:31:50.815542 systemd[1]: Reloading...
Dec 13 13:31:50.943522 zram_generator::config[2414]: No configuration found.
Dec 13 13:31:51.052203 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Dec 13 13:31:51.127760 systemd[1]: Reloading finished in 311 ms.
Dec 13 13:31:51.187105 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Dec 13 13:31:51.187280 systemd[1]: kubelet.service: Failed with result 'signal'.
Dec 13 13:31:51.187963 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:51.193829 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:51.325110 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:51.344987 (kubelet)[2459]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 13 13:31:51.408010 kubelet[2459]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 13 13:31:51.408010 kubelet[2459]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 13 13:31:51.408010 kubelet[2459]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
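[Editor's note] The ListenStream warning above is harmless: systemd rewrites the legacy /var/run path to /run on the fly. It can be silenced without touching the vendor unit by adding a drop-in that resets the socket list and restates the non-legacy path. A sketch of such an override (the drop-in path and file name are illustrative):

```ini
# /etc/systemd/system/docker.socket.d/10-run-path.conf (illustrative path)
[Socket]
# An empty assignment clears the inherited ListenStream list,
# then the non-legacy path is re-added.
ListenStream=
ListenStream=/run/docker.sock
```

After placing the drop-in, `systemctl daemon-reload` picks it up on the next reload.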
Dec 13 13:31:51.408436 kubelet[2459]: I1213 13:31:51.408113 2459 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 13 13:31:52.189617 kubelet[2459]: I1213 13:31:52.188911 2459 server.go:487] "Kubelet version" kubeletVersion="v1.29.2"
Dec 13 13:31:52.189617 kubelet[2459]: I1213 13:31:52.188948 2459 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 13 13:31:52.189617 kubelet[2459]: I1213 13:31:52.189182 2459 server.go:919] "Client rotation is on, will bootstrap in background"
Dec 13 13:31:52.225129 kubelet[2459]: E1213 13:31:52.224364 2459 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://188.245.225.138:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.225129 kubelet[2459]: I1213 13:31:52.224465 2459 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 13 13:31:52.234648 kubelet[2459]: I1213 13:31:52.234607 2459 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 13 13:31:52.236549 kubelet[2459]: I1213 13:31:52.236490 2459 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 13 13:31:52.237340 kubelet[2459]: I1213 13:31:52.236922 2459 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Dec 13 13:31:52.237340 kubelet[2459]: I1213 13:31:52.236955 2459 topology_manager.go:138] "Creating topology manager with none policy"
Dec 13 13:31:52.237340 kubelet[2459]: I1213 13:31:52.236965 2459 container_manager_linux.go:301] "Creating device plugin manager"
Dec 13 13:31:52.237340 kubelet[2459]: I1213 13:31:52.237133 2459 state_mem.go:36] "Initialized new in-memory state store"
Dec 13 13:31:52.240454 kubelet[2459]: I1213 13:31:52.240388 2459 kubelet.go:396] "Attempting to sync node with API server"
Dec 13 13:31:52.241385 kubelet[2459]: I1213 13:31:52.241111 2459 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 13 13:31:52.241385 kubelet[2459]: I1213 13:31:52.241156 2459 kubelet.go:312] "Adding apiserver pod source"
Dec 13 13:31:52.241385 kubelet[2459]: I1213 13:31:52.241172 2459 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 13 13:31:52.241385 kubelet[2459]: W1213 13:31:52.241209 2459 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://188.245.225.138:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-0-0-4-8ed7fad560&limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.241385 kubelet[2459]: E1213 13:31:52.241273 2459 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://188.245.225.138:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-0-0-4-8ed7fad560&limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.243465 kubelet[2459]: W1213 13:31:52.243392 2459 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://188.245.225.138:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.243465 kubelet[2459]: E1213 13:31:52.243443 2459 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://188.245.225.138:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.244481 kubelet[2459]: I1213 13:31:52.243690 2459 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Dec 13 13:31:52.244481 kubelet[2459]: I1213 13:31:52.244224 2459 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 13 13:31:52.245009 kubelet[2459]: W1213 13:31:52.244989 2459 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Dec 13 13:31:52.246024 kubelet[2459]: I1213 13:31:52.245999 2459 server.go:1256] "Started kubelet"
Dec 13 13:31:52.249005 kubelet[2459]: I1213 13:31:52.248875 2459 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Dec 13 13:31:52.249780 kubelet[2459]: I1213 13:31:52.249756 2459 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 13 13:31:52.250241 kubelet[2459]: I1213 13:31:52.250220 2459 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 13 13:31:52.250338 kubelet[2459]: I1213 13:31:52.249764 2459 server.go:461] "Adding debug handlers to kubelet server"
Dec 13 13:31:52.254155 kubelet[2459]: I1213 13:31:52.254114 2459 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 13 13:31:52.255208 kubelet[2459]: E1213 13:31:52.255119 2459 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://188.245.225.138:6443/api/v1/namespaces/default/events\": dial tcp 188.245.225.138:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4186-0-0-4-8ed7fad560.1810bfc5ab98107e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4186-0-0-4-8ed7fad560,UID:ci-4186-0-0-4-8ed7fad560,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4186-0-0-4-8ed7fad560,},FirstTimestamp:2024-12-13 13:31:52.245973118 +0000 UTC m=+0.894777694,LastTimestamp:2024-12-13 13:31:52.245973118 +0000 UTC m=+0.894777694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4186-0-0-4-8ed7fad560,}"
Dec 13 13:31:52.259790 kubelet[2459]: I1213 13:31:52.259318 2459 volume_manager.go:291] "Starting Kubelet Volume Manager"
Dec 13 13:31:52.260187 kubelet[2459]: I1213 13:31:52.260153 2459 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Dec 13 13:31:52.260467 kubelet[2459]: I1213 13:31:52.260436 2459 reconciler_new.go:29] "Reconciler: start to sync state"
Dec 13 13:31:52.262302 kubelet[2459]: W1213 13:31:52.262221 2459 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://188.245.225.138:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.262302 kubelet[2459]: E1213 13:31:52.262291 2459 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://188.245.225.138:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.262441 kubelet[2459]: E1213 13:31:52.262373 2459 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.225.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-0-0-4-8ed7fad560?timeout=10s\": dial tcp 188.245.225.138:6443: connect: connection refused" interval="200ms"
Dec 13 13:31:52.264122 kubelet[2459]: I1213 13:31:52.264023 2459 factory.go:221] Registration of the systemd container factory successfully
Dec 13 13:31:52.264733 kubelet[2459]: I1213 13:31:52.264215 2459 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 13 13:31:52.268290 kubelet[2459]: E1213 13:31:52.268248 2459 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 13 13:31:52.269364 kubelet[2459]: I1213 13:31:52.268582 2459 factory.go:221] Registration of the containerd container factory successfully
Dec 13 13:31:52.297890 kubelet[2459]: I1213 13:31:52.297860 2459 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 13 13:31:52.300776 kubelet[2459]: I1213 13:31:52.300738 2459 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 13 13:31:52.300776 kubelet[2459]: I1213 13:31:52.300766 2459 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 13 13:31:52.300776 kubelet[2459]: I1213 13:31:52.300783 2459 kubelet.go:2329] "Starting kubelet main sync loop"
Dec 13 13:31:52.300959 kubelet[2459]: E1213 13:31:52.300825 2459 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 13 13:31:52.302308 kubelet[2459]: I1213 13:31:52.302279 2459 cpu_manager.go:214] "Starting CPU manager" policy="none"
Dec 13 13:31:52.302599 kubelet[2459]: I1213 13:31:52.302574 2459 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Dec 13 13:31:52.302668 kubelet[2459]: I1213 13:31:52.302607 2459 state_mem.go:36] "Initialized new in-memory state store"
Dec 13 13:31:52.304238 kubelet[2459]: W1213 13:31:52.304167 2459 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://188.245.225.138:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.305743 kubelet[2459]: E1213 13:31:52.305709 2459 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://188.245.225.138:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:52.306821 kubelet[2459]: I1213 13:31:52.306669 2459 policy_none.go:49] "None policy: Start"
Dec 13 13:31:52.308432 kubelet[2459]: I1213 13:31:52.308403 2459 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 13 13:31:52.309181 kubelet[2459]: I1213 13:31:52.308789 2459 state_mem.go:35] "Initializing new in-memory state store"
Dec 13 13:31:52.316099 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 13 13:31:52.331264 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Dec 13 13:31:52.335477 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Dec 13 13:31:52.344983 kubelet[2459]: I1213 13:31:52.344660 2459 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 13 13:31:52.345206 kubelet[2459]: I1213 13:31:52.345179 2459 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 13 13:31:52.348988 kubelet[2459]: E1213 13:31:52.348927 2459 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4186-0-0-4-8ed7fad560\" not found"
Dec 13 13:31:52.364581 kubelet[2459]: I1213 13:31:52.363999 2459 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.364581 kubelet[2459]: E1213 13:31:52.364549 2459 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.225.138:6443/api/v1/nodes\": dial tcp 188.245.225.138:6443: connect: connection refused" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.401380 kubelet[2459]: I1213 13:31:52.401281 2459 topology_manager.go:215] "Topology Admit Handler" podUID="cc94f6c454172adfcff39394410207d5" podNamespace="kube-system" podName="kube-apiserver-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.405888 kubelet[2459]: I1213 13:31:52.405858 2459 topology_manager.go:215] "Topology Admit Handler" podUID="b7c2689386232c773aa6626cd2da08e8" podNamespace="kube-system" podName="kube-controller-manager-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.409473 kubelet[2459]: I1213 13:31:52.408733 2459 topology_manager.go:215] "Topology Admit Handler" podUID="c0630d70ed514ab3d2845f9b1e48390f" podNamespace="kube-system" podName="kube-scheduler-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.417972 systemd[1]: Created slice kubepods-burstable-podcc94f6c454172adfcff39394410207d5.slice - libcontainer container kubepods-burstable-podcc94f6c454172adfcff39394410207d5.slice.
Dec 13 13:31:52.443403 systemd[1]: Created slice kubepods-burstable-podb7c2689386232c773aa6626cd2da08e8.slice - libcontainer container kubepods-burstable-podb7c2689386232c773aa6626cd2da08e8.slice.
Dec 13 13:31:52.458699 systemd[1]: Created slice kubepods-burstable-podc0630d70ed514ab3d2845f9b1e48390f.slice - libcontainer container kubepods-burstable-podc0630d70ed514ab3d2845f9b1e48390f.slice.
Dec 13 13:31:52.464015 kubelet[2459]: E1213 13:31:52.463940 2459 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.225.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-0-0-4-8ed7fad560?timeout=10s\": dial tcp 188.245.225.138:6443: connect: connection refused" interval="400ms"
Dec 13 13:31:52.561809 kubelet[2459]: I1213 13:31:52.561355 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cc94f6c454172adfcff39394410207d5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" (UID: \"cc94f6c454172adfcff39394410207d5\") " pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.561809 kubelet[2459]: I1213 13:31:52.561411 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-ca-certs\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.561809 kubelet[2459]: I1213 13:31:52.561434 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-k8s-certs\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.561809 kubelet[2459]: I1213 13:31:52.561460 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.561809 kubelet[2459]: I1213 13:31:52.561483 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cc94f6c454172adfcff39394410207d5-ca-certs\") pod \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" (UID: \"cc94f6c454172adfcff39394410207d5\") " pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.563066 kubelet[2459]: I1213 13:31:52.561533 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cc94f6c454172adfcff39394410207d5-k8s-certs\") pod \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" (UID: \"cc94f6c454172adfcff39394410207d5\") " pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.563066 kubelet[2459]: I1213 13:31:52.561557 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c0630d70ed514ab3d2845f9b1e48390f-kubeconfig\") pod \"kube-scheduler-ci-4186-0-0-4-8ed7fad560\" (UID: \"c0630d70ed514ab3d2845f9b1e48390f\") " pod="kube-system/kube-scheduler-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.563066 kubelet[2459]: I1213 13:31:52.562357 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-flexvolume-dir\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.563066 kubelet[2459]: I1213 13:31:52.562422 2459 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-kubeconfig\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.568675 kubelet[2459]: I1213 13:31:52.567876 2459 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.568675 kubelet[2459]: E1213 13:31:52.568356 2459 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.225.138:6443/api/v1/nodes\": dial tcp 188.245.225.138:6443: connect: connection refused" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.741588 containerd[1480]: time="2024-12-13T13:31:52.741060512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4186-0-0-4-8ed7fad560,Uid:cc94f6c454172adfcff39394410207d5,Namespace:kube-system,Attempt:0,}"
Dec 13 13:31:52.756053 containerd[1480]: time="2024-12-13T13:31:52.755790841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4186-0-0-4-8ed7fad560,Uid:b7c2689386232c773aa6626cd2da08e8,Namespace:kube-system,Attempt:0,}"
Dec 13 13:31:52.762649 containerd[1480]: time="2024-12-13T13:31:52.762300764Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4186-0-0-4-8ed7fad560,Uid:c0630d70ed514ab3d2845f9b1e48390f,Namespace:kube-system,Attempt:0,}"
Dec 13 13:31:52.865582 kubelet[2459]: E1213 13:31:52.865539 2459 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.225.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-0-0-4-8ed7fad560?timeout=10s\": dial tcp 188.245.225.138:6443: connect: connection refused" interval="800ms"
Dec 13 13:31:52.971872 kubelet[2459]: I1213 13:31:52.971463 2459 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:52.972292 kubelet[2459]: E1213 13:31:52.972210 2459 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.225.138:6443/api/v1/nodes\": dial tcp 188.245.225.138:6443: connect: connection refused" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:53.214787 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1027228205.mount: Deactivated successfully.
Dec 13 13:31:53.225252 containerd[1480]: time="2024-12-13T13:31:53.225182908Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 13 13:31:53.230271 containerd[1480]: time="2024-12-13T13:31:53.230189036Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193"
Dec 13 13:31:53.231684 containerd[1480]: time="2024-12-13T13:31:53.231399147Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 13 13:31:53.234388 containerd[1480]: time="2024-12-13T13:31:53.233731207Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 13 13:31:53.235707 containerd[1480]: time="2024-12-13T13:31:53.235670496Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 13 13:31:53.236830 containerd[1480]: time="2024-12-13T13:31:53.236783685Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Dec 13 13:31:53.237955 containerd[1480]: time="2024-12-13T13:31:53.237903673Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Dec 13 13:31:53.239264 containerd[1480]: time="2024-12-13T13:31:53.239218827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 13 13:31:53.241691 containerd[1480]: time="2024-12-13T13:31:53.241656209Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 479.272723ms"
Dec 13 13:31:53.243283 containerd[1480]: time="2024-12-13T13:31:53.243245850Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 502.072375ms"
Dec 13 13:31:53.246592 containerd[1480]: time="2024-12-13T13:31:53.246364210Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 490.470406ms"
Dec 13 13:31:53.345757 kubelet[2459]: W1213 13:31:53.345686 2459 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://188.245.225.138:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-0-0-4-8ed7fad560&limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:53.346965 kubelet[2459]: E1213 13:31:53.346881 2459 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://188.245.225.138:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4186-0-0-4-8ed7fad560&limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:53.406122 containerd[1480]: time="2024-12-13T13:31:53.405837964Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Dec 13 13:31:53.406122 containerd[1480]: time="2024-12-13T13:31:53.405933527Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Dec 13 13:31:53.406122 containerd[1480]: time="2024-12-13T13:31:53.405948247Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:31:53.406122 containerd[1480]: time="2024-12-13T13:31:53.406075211Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408842161Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408892602Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408908203Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408982365Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408817841Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408874162Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408885402Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:31:53.409370 containerd[1480]: time="2024-12-13T13:31:53.408975685Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:31:53.431802 systemd[1]: Started cri-containerd-785241592a8cfbb9218c4f910be1c02c928b75d114969c957532f37e3b465e21.scope - libcontainer container 785241592a8cfbb9218c4f910be1c02c928b75d114969c957532f37e3b465e21.
Dec 13 13:31:53.440591 systemd[1]: Started cri-containerd-46ccf29301f82a8442a09c63c340e77147f0b55d2d52caf58d764cd70dc445e0.scope - libcontainer container 46ccf29301f82a8442a09c63c340e77147f0b55d2d52caf58d764cd70dc445e0.
Dec 13 13:31:53.454763 systemd[1]: Started cri-containerd-9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004.scope - libcontainer container 9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004.
Dec 13 13:31:53.488531 kubelet[2459]: W1213 13:31:53.488294 2459 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://188.245.225.138:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:53.488531 kubelet[2459]: E1213 13:31:53.488441 2459 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://188.245.225.138:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.225.138:6443: connect: connection refused
Dec 13 13:31:53.510725 containerd[1480]: time="2024-12-13T13:31:53.510339235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4186-0-0-4-8ed7fad560,Uid:cc94f6c454172adfcff39394410207d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"785241592a8cfbb9218c4f910be1c02c928b75d114969c957532f37e3b465e21\""
Dec 13 13:31:53.521876 containerd[1480]: time="2024-12-13T13:31:53.521757286Z" level=info msg="CreateContainer within sandbox \"785241592a8cfbb9218c4f910be1c02c928b75d114969c957532f37e3b465e21\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Dec 13 13:31:53.528753 containerd[1480]: time="2024-12-13T13:31:53.528636662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4186-0-0-4-8ed7fad560,Uid:c0630d70ed514ab3d2845f9b1e48390f,Namespace:kube-system,Attempt:0,} returns sandbox id \"46ccf29301f82a8442a09c63c340e77147f0b55d2d52caf58d764cd70dc445e0\""
Dec 13 13:31:53.535899 containerd[1480]: time="2024-12-13T13:31:53.535319913Z" level=info msg="CreateContainer within sandbox \"46ccf29301f82a8442a09c63c340e77147f0b55d2d52caf58d764cd70dc445e0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Dec 13 13:31:53.535899 containerd[1480]: time="2024-12-13T13:31:53.535533958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4186-0-0-4-8ed7fad560,Uid:b7c2689386232c773aa6626cd2da08e8,Namespace:kube-system,Attempt:0,} returns sandbox id \"9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004\""
Dec 13 13:31:53.541164 containerd[1480]: time="2024-12-13T13:31:53.541085260Z" level=info msg="CreateContainer within sandbox \"9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Dec 13 13:31:53.549997 containerd[1480]: time="2024-12-13T13:31:53.549884845Z" level=info msg="CreateContainer within sandbox \"785241592a8cfbb9218c4f910be1c02c928b75d114969c957532f37e3b465e21\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9ffbc314d5b48d63c1b66b4c3403706b729e97c13a9d5f0daaddcf1cb7e1717c\""
Dec 13 13:31:53.551762 containerd[1480]: time="2024-12-13T13:31:53.551709652Z" level=info msg="StartContainer for \"9ffbc314d5b48d63c1b66b4c3403706b729e97c13a9d5f0daaddcf1cb7e1717c\""
Dec 13 13:31:53.577075 containerd[1480]: time="2024-12-13T13:31:53.576935216Z" level=info msg="CreateContainer within sandbox \"46ccf29301f82a8442a09c63c340e77147f0b55d2d52caf58d764cd70dc445e0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad\""
Dec 13 13:31:53.578802 containerd[1480]: time="2024-12-13T13:31:53.578628820Z" level=info msg="StartContainer for \"1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad\""
Dec 13 13:31:53.579002 containerd[1480]: time="2024-12-13T13:31:53.578960748Z" level=info msg="CreateContainer within sandbox \"9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b\""
Dec 13 13:31:53.579745 containerd[1480]: time="2024-12-13T13:31:53.579709247Z" level=info msg="StartContainer for \"c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b\""
Dec 13 13:31:53.591730 systemd[1]: Started cri-containerd-9ffbc314d5b48d63c1b66b4c3403706b729e97c13a9d5f0daaddcf1cb7e1717c.scope - libcontainer container 9ffbc314d5b48d63c1b66b4c3403706b729e97c13a9d5f0daaddcf1cb7e1717c.
Dec 13 13:31:53.627776 systemd[1]: Started cri-containerd-1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad.scope - libcontainer container 1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad.
Dec 13 13:31:53.643336 systemd[1]: Started cri-containerd-c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b.scope - libcontainer container c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b.
Dec 13 13:31:53.651311 containerd[1480]: time="2024-12-13T13:31:53.651177713Z" level=info msg="StartContainer for \"9ffbc314d5b48d63c1b66b4c3403706b729e97c13a9d5f0daaddcf1cb7e1717c\" returns successfully"
Dec 13 13:31:53.667283 kubelet[2459]: E1213 13:31:53.667243 2459 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.225.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4186-0-0-4-8ed7fad560?timeout=10s\": dial tcp 188.245.225.138:6443: connect: connection refused" interval="1.6s"
Dec 13 13:31:53.693710 containerd[1480]: time="2024-12-13T13:31:53.693660279Z" level=info msg="StartContainer for \"1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad\" returns successfully"
Dec 13 13:31:53.715434 containerd[1480]: time="2024-12-13T13:31:53.715171669Z" level=info msg="StartContainer for \"c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b\" returns successfully"
Dec 13 13:31:53.776162 kubelet[2459]: I1213 13:31:53.776034 2459 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:53.777012 kubelet[2459]: E1213 13:31:53.776967 2459 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.225.138:6443/api/v1/nodes\": dial tcp 188.245.225.138:6443: connect: connection refused" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:55.380622 kubelet[2459]: I1213 13:31:55.379415 2459 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:55.850897 kubelet[2459]: E1213 13:31:55.850855 2459 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4186-0-0-4-8ed7fad560\" not found" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:55.906529 kubelet[2459]: I1213 13:31:55.906472 2459 kubelet_node_status.go:76] "Successfully registered node" node="ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:56.244387 kubelet[2459]: I1213 13:31:56.244344 2459 apiserver.go:52] "Watching apiserver"
Dec 13 13:31:56.262333 kubelet[2459]: I1213 13:31:56.261152 2459 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Dec 13 13:31:56.334514 kubelet[2459]: E1213 13:31:56.334455 2459 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560"
Dec 13 13:31:59.003593 systemd[1]: Reloading requested from client PID 2740 ('systemctl') (unit session-7.scope)...
Dec 13 13:31:59.003611 systemd[1]: Reloading...
Dec 13 13:31:59.120688 zram_generator::config[2783]: No configuration found.
Dec 13 13:31:59.221062 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Dec 13 13:31:59.302853 systemd[1]: Reloading finished in 298 ms.
Dec 13 13:31:59.346676 kubelet[2459]: I1213 13:31:59.346608 2459 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 13 13:31:59.347390 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:59.369003 systemd[1]: kubelet.service: Deactivated successfully.
Dec 13 13:31:59.370245 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:59.370446 systemd[1]: kubelet.service: Consumed 1.394s CPU time, 113.8M memory peak, 0B memory swap peak.
Dec 13 13:31:59.378922 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 13 13:31:59.505212 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 13 13:31:59.517343 (kubelet)[2825]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 13 13:31:59.576761 kubelet[2825]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 13 13:31:59.576761 kubelet[2825]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 13 13:31:59.576761 kubelet[2825]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 13 13:31:59.576761 kubelet[2825]: I1213 13:31:59.574959 2825 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 13 13:31:59.579801 kubelet[2825]: I1213 13:31:59.579768 2825 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Dec 13 13:31:59.579801 kubelet[2825]: I1213 13:31:59.579798 2825 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 13 13:31:59.580012 kubelet[2825]: I1213 13:31:59.579997 2825 server.go:919] "Client rotation is on, will bootstrap in background" Dec 13 13:31:59.583544 kubelet[2825]: I1213 13:31:59.583452 2825 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 13 13:31:59.586271 kubelet[2825]: I1213 13:31:59.585486 2825 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 13 13:31:59.608242 kubelet[2825]: I1213 13:31:59.607856 2825 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 13 13:31:59.608242 kubelet[2825]: I1213 13:31:59.608182 2825 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 13 13:31:59.609286 kubelet[2825]: I1213 13:31:59.609243 2825 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Dec 13 13:31:59.609431 kubelet[2825]: I1213 13:31:59.609296 2825 topology_manager.go:138] "Creating topology manager with none policy" Dec 13 13:31:59.609431 kubelet[2825]: I1213 13:31:59.609312 2825 container_manager_linux.go:301] "Creating device plugin manager" Dec 13 13:31:59.609431 kubelet[2825]: I1213 
13:31:59.609362 2825 state_mem.go:36] "Initialized new in-memory state store" Dec 13 13:31:59.610170 kubelet[2825]: I1213 13:31:59.609565 2825 kubelet.go:396] "Attempting to sync node with API server" Dec 13 13:31:59.610170 kubelet[2825]: I1213 13:31:59.609588 2825 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 13 13:31:59.610170 kubelet[2825]: I1213 13:31:59.609626 2825 kubelet.go:312] "Adding apiserver pod source" Dec 13 13:31:59.610170 kubelet[2825]: I1213 13:31:59.609642 2825 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 13 13:31:59.616542 kubelet[2825]: I1213 13:31:59.614282 2825 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Dec 13 13:31:59.616542 kubelet[2825]: I1213 13:31:59.614963 2825 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 13 13:31:59.616542 kubelet[2825]: I1213 13:31:59.616035 2825 server.go:1256] "Started kubelet" Dec 13 13:31:59.620418 kubelet[2825]: I1213 13:31:59.620364 2825 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 13 13:31:59.625010 kubelet[2825]: I1213 13:31:59.624970 2825 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Dec 13 13:31:59.629585 kubelet[2825]: I1213 13:31:59.629554 2825 server.go:461] "Adding debug handlers to kubelet server" Dec 13 13:31:59.630866 kubelet[2825]: I1213 13:31:59.630838 2825 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 13 13:31:59.632892 kubelet[2825]: I1213 13:31:59.632872 2825 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 13 13:31:59.638420 kubelet[2825]: I1213 13:31:59.638389 2825 volume_manager.go:291] "Starting Kubelet Volume Manager" Dec 13 13:31:59.642447 kubelet[2825]: I1213 13:31:59.638638 2825 desired_state_of_world_populator.go:151] "Desired 
state populator starts to run" Dec 13 13:31:59.645554 kubelet[2825]: I1213 13:31:59.645520 2825 reconciler_new.go:29] "Reconciler: start to sync state" Dec 13 13:31:59.675962 kubelet[2825]: I1213 13:31:59.675932 2825 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 13 13:31:59.678965 kubelet[2825]: I1213 13:31:59.677898 2825 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 13 13:31:59.679460 kubelet[2825]: I1213 13:31:59.679431 2825 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 13 13:31:59.679574 kubelet[2825]: I1213 13:31:59.679562 2825 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 13 13:31:59.680170 kubelet[2825]: I1213 13:31:59.680151 2825 kubelet.go:2329] "Starting kubelet main sync loop" Dec 13 13:31:59.681867 kubelet[2825]: E1213 13:31:59.680283 2825 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 13 13:31:59.690778 kubelet[2825]: E1213 13:31:59.690355 2825 kubelet.go:1462] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 13 13:31:59.693302 kubelet[2825]: I1213 13:31:59.693260 2825 factory.go:221] Registration of the containerd container factory successfully Dec 13 13:31:59.693302 kubelet[2825]: I1213 13:31:59.693289 2825 factory.go:221] Registration of the systemd container factory successfully Dec 13 13:31:59.745447 kubelet[2825]: I1213 13:31:59.745408 2825 kubelet_node_status.go:73] "Attempting to register node" node="ci-4186-0-0-4-8ed7fad560" Dec 13 13:31:59.758279 kubelet[2825]: I1213 13:31:59.757981 2825 kubelet_node_status.go:112] "Node was previously registered" node="ci-4186-0-0-4-8ed7fad560" Dec 13 13:31:59.758279 kubelet[2825]: I1213 13:31:59.758220 2825 kubelet_node_status.go:76] "Successfully registered node" node="ci-4186-0-0-4-8ed7fad560" Dec 13 13:31:59.783333 kubelet[2825]: E1213 13:31:59.782843 2825 kubelet.go:2353] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 13 13:31:59.786483 kubelet[2825]: I1213 13:31:59.786433 2825 cpu_manager.go:214] "Starting CPU manager" policy="none" Dec 13 13:31:59.786483 kubelet[2825]: I1213 13:31:59.786467 2825 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Dec 13 13:31:59.786483 kubelet[2825]: I1213 13:31:59.786490 2825 state_mem.go:36] "Initialized new in-memory state store" Dec 13 13:31:59.786842 kubelet[2825]: I1213 13:31:59.786660 2825 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 13 13:31:59.786842 kubelet[2825]: I1213 13:31:59.786683 2825 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 13 13:31:59.786842 kubelet[2825]: I1213 13:31:59.786690 2825 policy_none.go:49] "None policy: Start" Dec 13 13:31:59.787769 kubelet[2825]: I1213 13:31:59.787739 2825 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 13 13:31:59.787769 kubelet[2825]: I1213 13:31:59.787776 2825 state_mem.go:35] "Initializing new in-memory state store" Dec 13 
13:31:59.788242 kubelet[2825]: I1213 13:31:59.788104 2825 state_mem.go:75] "Updated machine memory state" Dec 13 13:31:59.793559 kubelet[2825]: I1213 13:31:59.793519 2825 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 13 13:31:59.794468 kubelet[2825]: I1213 13:31:59.794445 2825 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 13 13:31:59.986546 kubelet[2825]: I1213 13:31:59.983193 2825 topology_manager.go:215] "Topology Admit Handler" podUID="cc94f6c454172adfcff39394410207d5" podNamespace="kube-system" podName="kube-apiserver-ci-4186-0-0-4-8ed7fad560" Dec 13 13:31:59.986546 kubelet[2825]: I1213 13:31:59.983305 2825 topology_manager.go:215] "Topology Admit Handler" podUID="b7c2689386232c773aa6626cd2da08e8" podNamespace="kube-system" podName="kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:31:59.986546 kubelet[2825]: I1213 13:31:59.983375 2825 topology_manager.go:215] "Topology Admit Handler" podUID="c0630d70ed514ab3d2845f9b1e48390f" podNamespace="kube-system" podName="kube-scheduler-ci-4186-0-0-4-8ed7fad560" Dec 13 13:31:59.994277 kubelet[2825]: E1213 13:31:59.994214 2825 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" already exists" pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064211 kubelet[2825]: I1213 13:32:00.063724 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c0630d70ed514ab3d2845f9b1e48390f-kubeconfig\") pod \"kube-scheduler-ci-4186-0-0-4-8ed7fad560\" (UID: \"c0630d70ed514ab3d2845f9b1e48390f\") " pod="kube-system/kube-scheduler-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064211 kubelet[2825]: I1213 13:32:00.063787 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/cc94f6c454172adfcff39394410207d5-k8s-certs\") pod \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" (UID: \"cc94f6c454172adfcff39394410207d5\") " pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064211 kubelet[2825]: I1213 13:32:00.063828 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-k8s-certs\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064211 kubelet[2825]: I1213 13:32:00.063861 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-ca-certs\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064211 kubelet[2825]: I1213 13:32:00.063891 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-flexvolume-dir\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064535 kubelet[2825]: I1213 13:32:00.063918 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-kubeconfig\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064535 kubelet[2825]: I1213 
13:32:00.063945 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b7c2689386232c773aa6626cd2da08e8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4186-0-0-4-8ed7fad560\" (UID: \"b7c2689386232c773aa6626cd2da08e8\") " pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064535 kubelet[2825]: I1213 13:32:00.063983 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cc94f6c454172adfcff39394410207d5-ca-certs\") pod \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" (UID: \"cc94f6c454172adfcff39394410207d5\") " pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.064535 kubelet[2825]: I1213 13:32:00.064020 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cc94f6c454172adfcff39394410207d5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4186-0-0-4-8ed7fad560\" (UID: \"cc94f6c454172adfcff39394410207d5\") " pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:00.610809 kubelet[2825]: I1213 13:32:00.610752 2825 apiserver.go:52] "Watching apiserver" Dec 13 13:32:00.643737 kubelet[2825]: I1213 13:32:00.643680 2825 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Dec 13 13:32:00.810824 kubelet[2825]: I1213 13:32:00.810632 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4186-0-0-4-8ed7fad560" podStartSLOduration=1.8105797300000002 podStartE2EDuration="1.81057973s" podCreationTimestamp="2024-12-13 13:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-12-13 
13:32:00.787909598 +0000 UTC m=+1.265216860" watchObservedRunningTime="2024-12-13 13:32:00.81057973 +0000 UTC m=+1.287886992" Dec 13 13:32:00.829001 kubelet[2825]: I1213 13:32:00.828797 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4186-0-0-4-8ed7fad560" podStartSLOduration=1.828736572 podStartE2EDuration="1.828736572s" podCreationTimestamp="2024-12-13 13:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-12-13 13:32:00.812176336 +0000 UTC m=+1.289483598" watchObservedRunningTime="2024-12-13 13:32:00.828736572 +0000 UTC m=+1.306043834" Dec 13 13:32:00.831322 kubelet[2825]: I1213 13:32:00.831070 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4186-0-0-4-8ed7fad560" podStartSLOduration=3.831001917 podStartE2EDuration="3.831001917s" podCreationTimestamp="2024-12-13 13:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-12-13 13:32:00.830823192 +0000 UTC m=+1.308130454" watchObservedRunningTime="2024-12-13 13:32:00.831001917 +0000 UTC m=+1.308309219" Dec 13 13:32:05.005045 sudo[1871]: pam_unix(sudo:session): session closed for user root Dec 13 13:32:05.163338 sshd[1870]: Connection closed by 147.75.109.163 port 45592 Dec 13 13:32:05.164034 sshd-session[1868]: pam_unix(sshd:session): session closed for user core Dec 13 13:32:05.169137 systemd[1]: sshd@6-188.245.225.138:22-147.75.109.163:45592.service: Deactivated successfully. Dec 13 13:32:05.172644 systemd[1]: session-7.scope: Deactivated successfully. Dec 13 13:32:05.172835 systemd[1]: session-7.scope: Consumed 8.264s CPU time, 187.1M memory peak, 0B memory swap peak. Dec 13 13:32:05.173872 systemd-logind[1464]: Session 7 logged out. Waiting for processes to exit. 
Dec 13 13:32:05.175978 systemd-logind[1464]: Removed session 7. Dec 13 13:32:14.243916 kubelet[2825]: I1213 13:32:14.241611 2825 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 13 13:32:14.244418 containerd[1480]: time="2024-12-13T13:32:14.243788162Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 13 13:32:14.245390 kubelet[2825]: I1213 13:32:14.244952 2825 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 13 13:32:15.131857 kubelet[2825]: I1213 13:32:15.131799 2825 topology_manager.go:215] "Topology Admit Handler" podUID="ef58c2a0-d5bc-407b-9a26-f516eb24258c" podNamespace="kube-system" podName="kube-proxy-qnhw7" Dec 13 13:32:15.146150 systemd[1]: Created slice kubepods-besteffort-podef58c2a0_d5bc_407b_9a26_f516eb24258c.slice - libcontainer container kubepods-besteffort-podef58c2a0_d5bc_407b_9a26_f516eb24258c.slice. Dec 13 13:32:15.166008 kubelet[2825]: I1213 13:32:15.165959 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h9rx\" (UniqueName: \"kubernetes.io/projected/ef58c2a0-d5bc-407b-9a26-f516eb24258c-kube-api-access-7h9rx\") pod \"kube-proxy-qnhw7\" (UID: \"ef58c2a0-d5bc-407b-9a26-f516eb24258c\") " pod="kube-system/kube-proxy-qnhw7" Dec 13 13:32:15.166008 kubelet[2825]: I1213 13:32:15.166016 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ef58c2a0-d5bc-407b-9a26-f516eb24258c-xtables-lock\") pod \"kube-proxy-qnhw7\" (UID: \"ef58c2a0-d5bc-407b-9a26-f516eb24258c\") " pod="kube-system/kube-proxy-qnhw7" Dec 13 13:32:15.166212 kubelet[2825]: I1213 13:32:15.166066 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: 
\"kubernetes.io/configmap/ef58c2a0-d5bc-407b-9a26-f516eb24258c-kube-proxy\") pod \"kube-proxy-qnhw7\" (UID: \"ef58c2a0-d5bc-407b-9a26-f516eb24258c\") " pod="kube-system/kube-proxy-qnhw7" Dec 13 13:32:15.166212 kubelet[2825]: I1213 13:32:15.166089 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef58c2a0-d5bc-407b-9a26-f516eb24258c-lib-modules\") pod \"kube-proxy-qnhw7\" (UID: \"ef58c2a0-d5bc-407b-9a26-f516eb24258c\") " pod="kube-system/kube-proxy-qnhw7" Dec 13 13:32:15.307360 kubelet[2825]: I1213 13:32:15.307306 2825 topology_manager.go:215] "Topology Admit Handler" podUID="7dbec970-de14-4ab3-8181-5e888a7371e4" podNamespace="tigera-operator" podName="tigera-operator-c7ccbd65-8kfhh" Dec 13 13:32:15.318629 systemd[1]: Created slice kubepods-besteffort-pod7dbec970_de14_4ab3_8181_5e888a7371e4.slice - libcontainer container kubepods-besteffort-pod7dbec970_de14_4ab3_8181_5e888a7371e4.slice. 
Dec 13 13:32:15.367816 kubelet[2825]: I1213 13:32:15.367764 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7dbec970-de14-4ab3-8181-5e888a7371e4-var-lib-calico\") pod \"tigera-operator-c7ccbd65-8kfhh\" (UID: \"7dbec970-de14-4ab3-8181-5e888a7371e4\") " pod="tigera-operator/tigera-operator-c7ccbd65-8kfhh" Dec 13 13:32:15.368017 kubelet[2825]: I1213 13:32:15.367860 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5g7\" (UniqueName: \"kubernetes.io/projected/7dbec970-de14-4ab3-8181-5e888a7371e4-kube-api-access-2w5g7\") pod \"tigera-operator-c7ccbd65-8kfhh\" (UID: \"7dbec970-de14-4ab3-8181-5e888a7371e4\") " pod="tigera-operator/tigera-operator-c7ccbd65-8kfhh" Dec 13 13:32:15.455050 containerd[1480]: time="2024-12-13T13:32:15.454962570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qnhw7,Uid:ef58c2a0-d5bc-407b-9a26-f516eb24258c,Namespace:kube-system,Attempt:0,}" Dec 13 13:32:15.509320 containerd[1480]: time="2024-12-13T13:32:15.507530258Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:15.509320 containerd[1480]: time="2024-12-13T13:32:15.507638141Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:15.509320 containerd[1480]: time="2024-12-13T13:32:15.507651662Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:15.509320 containerd[1480]: time="2024-12-13T13:32:15.507766506Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:15.534937 systemd[1]: Started cri-containerd-fb3b58c85255e147ee90db3d58dc3b4b0a53446d33d160cc100e33f84a8d8738.scope - libcontainer container fb3b58c85255e147ee90db3d58dc3b4b0a53446d33d160cc100e33f84a8d8738. Dec 13 13:32:15.563541 containerd[1480]: time="2024-12-13T13:32:15.563488419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qnhw7,Uid:ef58c2a0-d5bc-407b-9a26-f516eb24258c,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb3b58c85255e147ee90db3d58dc3b4b0a53446d33d160cc100e33f84a8d8738\"" Dec 13 13:32:15.568527 containerd[1480]: time="2024-12-13T13:32:15.568451906Z" level=info msg="CreateContainer within sandbox \"fb3b58c85255e147ee90db3d58dc3b4b0a53446d33d160cc100e33f84a8d8738\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 13 13:32:15.587788 containerd[1480]: time="2024-12-13T13:32:15.587708074Z" level=info msg="CreateContainer within sandbox \"fb3b58c85255e147ee90db3d58dc3b4b0a53446d33d160cc100e33f84a8d8738\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"94d73b14de6d671841ab553e539c6828e6ab310b96f6f88fa32c6b9fcc6a67d7\"" Dec 13 13:32:15.589631 containerd[1480]: time="2024-12-13T13:32:15.589159282Z" level=info msg="StartContainer for \"94d73b14de6d671841ab553e539c6828e6ab310b96f6f88fa32c6b9fcc6a67d7\"" Dec 13 13:32:15.623067 containerd[1480]: time="2024-12-13T13:32:15.622551085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-8kfhh,Uid:7dbec970-de14-4ab3-8181-5e888a7371e4,Namespace:tigera-operator,Attempt:0,}" Dec 13 13:32:15.628939 systemd[1]: Started cri-containerd-94d73b14de6d671841ab553e539c6828e6ab310b96f6f88fa32c6b9fcc6a67d7.scope - libcontainer container 94d73b14de6d671841ab553e539c6828e6ab310b96f6f88fa32c6b9fcc6a67d7. Dec 13 13:32:15.670671 containerd[1480]: time="2024-12-13T13:32:15.670489857Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:15.671430 containerd[1480]: time="2024-12-13T13:32:15.671298444Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:15.671430 containerd[1480]: time="2024-12-13T13:32:15.671382367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:15.671807 containerd[1480]: time="2024-12-13T13:32:15.671766380Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:15.677767 containerd[1480]: time="2024-12-13T13:32:15.677718420Z" level=info msg="StartContainer for \"94d73b14de6d671841ab553e539c6828e6ab310b96f6f88fa32c6b9fcc6a67d7\" returns successfully" Dec 13 13:32:15.706763 systemd[1]: Started cri-containerd-6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a.scope - libcontainer container 6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a. 
Dec 13 13:32:15.755578 containerd[1480]: time="2024-12-13T13:32:15.755530676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-8kfhh,Uid:7dbec970-de14-4ab3-8181-5e888a7371e4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a\"" Dec 13 13:32:15.759799 containerd[1480]: time="2024-12-13T13:32:15.759245681Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Dec 13 13:32:15.810020 kubelet[2825]: I1213 13:32:15.809983 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-qnhw7" podStartSLOduration=0.809849983 podStartE2EDuration="809.849983ms" podCreationTimestamp="2024-12-13 13:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-12-13 13:32:15.809737379 +0000 UTC m=+16.287044681" watchObservedRunningTime="2024-12-13 13:32:15.809849983 +0000 UTC m=+16.287157285" Dec 13 13:32:18.154585 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2713863016.mount: Deactivated successfully. 
Dec 13 13:32:18.735815 containerd[1480]: time="2024-12-13T13:32:18.735511426Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:18.737705 containerd[1480]: time="2024-12-13T13:32:18.737640699Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=19125956" Dec 13 13:32:18.739908 containerd[1480]: time="2024-12-13T13:32:18.738939704Z" level=info msg="ImageCreate event name:\"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:18.743808 containerd[1480]: time="2024-12-13T13:32:18.743748309Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:18.744679 containerd[1480]: time="2024-12-13T13:32:18.744624059Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"19120155\" in 2.985336257s" Dec 13 13:32:18.744679 containerd[1480]: time="2024-12-13T13:32:18.744674661Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\"" Dec 13 13:32:18.750407 containerd[1480]: time="2024-12-13T13:32:18.750066126Z" level=info msg="CreateContainer within sandbox \"6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 13 13:32:18.777335 containerd[1480]: time="2024-12-13T13:32:18.777285701Z" level=info msg="CreateContainer within sandbox 
\"6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469\"" Dec 13 13:32:18.778331 containerd[1480]: time="2024-12-13T13:32:18.778296576Z" level=info msg="StartContainer for \"906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469\"" Dec 13 13:32:18.813797 systemd[1]: Started cri-containerd-906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469.scope - libcontainer container 906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469. Dec 13 13:32:18.848010 containerd[1480]: time="2024-12-13T13:32:18.847935768Z" level=info msg="StartContainer for \"906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469\" returns successfully" Dec 13 13:32:23.559841 kubelet[2825]: I1213 13:32:23.559785 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-c7ccbd65-8kfhh" podStartSLOduration=5.572084272 podStartE2EDuration="8.559742847s" podCreationTimestamp="2024-12-13 13:32:15 +0000 UTC" firstStartedPulling="2024-12-13 13:32:15.757477782 +0000 UTC m=+16.234785044" lastFinishedPulling="2024-12-13 13:32:18.745136357 +0000 UTC m=+19.222443619" observedRunningTime="2024-12-13 13:32:19.823210854 +0000 UTC m=+20.300518116" watchObservedRunningTime="2024-12-13 13:32:23.559742847 +0000 UTC m=+24.037050069" Dec 13 13:32:23.561201 kubelet[2825]: I1213 13:32:23.559923 2825 topology_manager.go:215] "Topology Admit Handler" podUID="4bb545d2-7125-4865-bac7-b977a16b0037" podNamespace="calico-system" podName="calico-typha-94754b74d-bdxw7" Dec 13 13:32:23.570831 systemd[1]: Created slice kubepods-besteffort-pod4bb545d2_7125_4865_bac7_b977a16b0037.slice - libcontainer container kubepods-besteffort-pod4bb545d2_7125_4865_bac7_b977a16b0037.slice. 
Dec 13 13:32:23.628741 kubelet[2825]: I1213 13:32:23.627469 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4bb545d2-7125-4865-bac7-b977a16b0037-typha-certs\") pod \"calico-typha-94754b74d-bdxw7\" (UID: \"4bb545d2-7125-4865-bac7-b977a16b0037\") " pod="calico-system/calico-typha-94754b74d-bdxw7"
Dec 13 13:32:23.628741 kubelet[2825]: I1213 13:32:23.627713 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmfr\" (UniqueName: \"kubernetes.io/projected/4bb545d2-7125-4865-bac7-b977a16b0037-kube-api-access-vmmfr\") pod \"calico-typha-94754b74d-bdxw7\" (UID: \"4bb545d2-7125-4865-bac7-b977a16b0037\") " pod="calico-system/calico-typha-94754b74d-bdxw7"
Dec 13 13:32:23.628741 kubelet[2825]: I1213 13:32:23.627836 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb545d2-7125-4865-bac7-b977a16b0037-tigera-ca-bundle\") pod \"calico-typha-94754b74d-bdxw7\" (UID: \"4bb545d2-7125-4865-bac7-b977a16b0037\") " pod="calico-system/calico-typha-94754b74d-bdxw7"
Dec 13 13:32:23.720881 kubelet[2825]: I1213 13:32:23.720841 2825 topology_manager.go:215] "Topology Admit Handler" podUID="de3e5be1-d7f6-4eb0-b955-783f5b94bf29" podNamespace="calico-system" podName="calico-node-rsmpl"
Dec 13 13:32:23.743868 systemd[1]: Created slice kubepods-besteffort-podde3e5be1_d7f6_4eb0_b955_783f5b94bf29.slice - libcontainer container kubepods-besteffort-podde3e5be1_d7f6_4eb0_b955_783f5b94bf29.slice.
Dec 13 13:32:23.829982 kubelet[2825]: I1213 13:32:23.829314 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-node-certs\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.829982 kubelet[2825]: I1213 13:32:23.829373 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-var-lib-calico\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.829982 kubelet[2825]: I1213 13:32:23.829405 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-flexvol-driver-host\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.829982 kubelet[2825]: I1213 13:32:23.829427 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-lib-modules\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.829982 kubelet[2825]: I1213 13:32:23.829448 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-policysync\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830252 kubelet[2825]: I1213 13:32:23.829467 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-cni-net-dir\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830252 kubelet[2825]: I1213 13:32:23.829489 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-cni-bin-dir\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830252 kubelet[2825]: I1213 13:32:23.829521 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-cni-log-dir\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830252 kubelet[2825]: I1213 13:32:23.829545 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-xtables-lock\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830252 kubelet[2825]: I1213 13:32:23.829614 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-tigera-ca-bundle\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830448 kubelet[2825]: I1213 13:32:23.829638 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-var-run-calico\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.830448 kubelet[2825]: I1213 13:32:23.829662 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwzp\" (UniqueName: \"kubernetes.io/projected/de3e5be1-d7f6-4eb0-b955-783f5b94bf29-kube-api-access-wdwzp\") pod \"calico-node-rsmpl\" (UID: \"de3e5be1-d7f6-4eb0-b955-783f5b94bf29\") " pod="calico-system/calico-node-rsmpl"
Dec 13 13:32:23.877339 containerd[1480]: time="2024-12-13T13:32:23.877280572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-94754b74d-bdxw7,Uid:4bb545d2-7125-4865-bac7-b977a16b0037,Namespace:calico-system,Attempt:0,}"
Dec 13 13:32:23.923214 containerd[1480]: time="2024-12-13T13:32:23.923102355Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Dec 13 13:32:23.925781 containerd[1480]: time="2024-12-13T13:32:23.925445118Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Dec 13 13:32:23.925781 containerd[1480]: time="2024-12-13T13:32:23.925474719Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:32:23.925781 containerd[1480]: time="2024-12-13T13:32:23.925631845Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Dec 13 13:32:23.939394 kubelet[2825]: I1213 13:32:23.938487 2825 topology_manager.go:215] "Topology Admit Handler" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" podNamespace="calico-system" podName="csi-node-driver-gvpdf"
Dec 13 13:32:23.941703 kubelet[2825]: E1213 13:32:23.941016 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5"
Dec 13 13:32:23.954456 kubelet[2825]: E1213 13:32:23.954414 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:23.954456 kubelet[2825]: W1213 13:32:23.954441 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:23.954640 kubelet[2825]: E1213 13:32:23.954516 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:23.971390 kubelet[2825]: E1213 13:32:23.971367 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:23.971592 kubelet[2825]: W1213 13:32:23.971574 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:23.971751 kubelet[2825]: E1213 13:32:23.971735 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:23.972791 systemd[1]: Started cri-containerd-bed24f411b123d4904d4ced5e386d52e2fefcb5a4b7400110270a4f96acb4c4b.scope - libcontainer container bed24f411b123d4904d4ced5e386d52e2fefcb5a4b7400110270a4f96acb4c4b.
Dec 13 13:32:23.984308 kubelet[2825]: E1213 13:32:23.984280 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:23.984537 kubelet[2825]: W1213 13:32:23.984456 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:23.984537 kubelet[2825]: E1213 13:32:23.984486 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:23.999689 kubelet[2825]: E1213 13:32:23.999434 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:23.999689 kubelet[2825]: W1213 13:32:23.999460 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:23.999689 kubelet[2825]: E1213 13:32:23.999492 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.000395 kubelet[2825]: E1213 13:32:24.000247 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.000395 kubelet[2825]: W1213 13:32:24.000266 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.000395 kubelet[2825]: E1213 13:32:24.000287 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.001644 kubelet[2825]: E1213 13:32:24.001166 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.001644 kubelet[2825]: W1213 13:32:24.001208 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.001644 kubelet[2825]: E1213 13:32:24.001251 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.003422 kubelet[2825]: E1213 13:32:24.003182 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.003422 kubelet[2825]: W1213 13:32:24.003227 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.003422 kubelet[2825]: E1213 13:32:24.003253 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.004117 kubelet[2825]: E1213 13:32:24.003821 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.004117 kubelet[2825]: W1213 13:32:24.003837 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.004117 kubelet[2825]: E1213 13:32:24.003853 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.006798 kubelet[2825]: E1213 13:32:24.006469 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.006798 kubelet[2825]: W1213 13:32:24.006491 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.006798 kubelet[2825]: E1213 13:32:24.006546 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.007344 kubelet[2825]: E1213 13:32:24.007168 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.007344 kubelet[2825]: W1213 13:32:24.007204 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.007344 kubelet[2825]: E1213 13:32:24.007224 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.008906 kubelet[2825]: E1213 13:32:24.008608 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.008906 kubelet[2825]: W1213 13:32:24.008628 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.008906 kubelet[2825]: E1213 13:32:24.008646 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.009468 kubelet[2825]: E1213 13:32:24.009265 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.009468 kubelet[2825]: W1213 13:32:24.009461 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.009601 kubelet[2825]: E1213 13:32:24.009482 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.010132 kubelet[2825]: E1213 13:32:24.010105 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.010132 kubelet[2825]: W1213 13:32:24.010123 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.010226 kubelet[2825]: E1213 13:32:24.010137 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.010666 kubelet[2825]: E1213 13:32:24.010649 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.010666 kubelet[2825]: W1213 13:32:24.010664 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.010791 kubelet[2825]: E1213 13:32:24.010776 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.011253 kubelet[2825]: E1213 13:32:24.011235 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.011312 kubelet[2825]: W1213 13:32:24.011252 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.011312 kubelet[2825]: E1213 13:32:24.011273 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.011997 kubelet[2825]: E1213 13:32:24.011977 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.012137 kubelet[2825]: W1213 13:32:24.012117 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.012169 kubelet[2825]: E1213 13:32:24.012145 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.012706 kubelet[2825]: E1213 13:32:24.012574 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.012706 kubelet[2825]: W1213 13:32:24.012705 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.012801 kubelet[2825]: E1213 13:32:24.012724 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.013158 kubelet[2825]: E1213 13:32:24.013143 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.013158 kubelet[2825]: W1213 13:32:24.013157 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.013360 kubelet[2825]: E1213 13:32:24.013170 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.013887 kubelet[2825]: E1213 13:32:24.013869 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.013887 kubelet[2825]: W1213 13:32:24.013887 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.013968 kubelet[2825]: E1213 13:32:24.013901 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.014377 kubelet[2825]: E1213 13:32:24.014361 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.014377 kubelet[2825]: W1213 13:32:24.014376 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.014377 kubelet[2825]: E1213 13:32:24.014510 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.016049 kubelet[2825]: E1213 13:32:24.016028 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.016049 kubelet[2825]: W1213 13:32:24.016047 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.016927 kubelet[2825]: E1213 13:32:24.016064 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.016927 kubelet[2825]: E1213 13:32:24.016250 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.016927 kubelet[2825]: W1213 13:32:24.016258 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.016927 kubelet[2825]: E1213 13:32:24.016269 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.017486 kubelet[2825]: E1213 13:32:24.017326 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.017486 kubelet[2825]: W1213 13:32:24.017338 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.017486 kubelet[2825]: E1213 13:32:24.017352 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.035531 kubelet[2825]: E1213 13:32:24.035259 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.036067 kubelet[2825]: W1213 13:32:24.036027 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.036192 kubelet[2825]: E1213 13:32:24.036176 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.036262 kubelet[2825]: I1213 13:32:24.036222 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a99d7c4c-ae69-4d70-a627-2ff0fceee5d5-socket-dir\") pod \"csi-node-driver-gvpdf\" (UID: \"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5\") " pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:24.038739 kubelet[2825]: E1213 13:32:24.038324 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.038739 kubelet[2825]: W1213 13:32:24.038356 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.038902 kubelet[2825]: E1213 13:32:24.038820 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.040129 kubelet[2825]: E1213 13:32:24.039698 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.040129 kubelet[2825]: W1213 13:32:24.039722 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.040129 kubelet[2825]: E1213 13:32:24.039743 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.040129 kubelet[2825]: I1213 13:32:24.039969 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a99d7c4c-ae69-4d70-a627-2ff0fceee5d5-varrun\") pod \"csi-node-driver-gvpdf\" (UID: \"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5\") " pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:24.040686 kubelet[2825]: E1213 13:32:24.040654 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.040686 kubelet[2825]: W1213 13:32:24.040675 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.040781 kubelet[2825]: E1213 13:32:24.040734 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.042945 kubelet[2825]: E1213 13:32:24.042077 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.042945 kubelet[2825]: W1213 13:32:24.042941 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.043141 kubelet[2825]: E1213 13:32:24.042984 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.043474 kubelet[2825]: E1213 13:32:24.043444 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.043474 kubelet[2825]: W1213 13:32:24.043463 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.043814 kubelet[2825]: E1213 13:32:24.043778 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.044153 kubelet[2825]: E1213 13:32:24.044091 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.044153 kubelet[2825]: W1213 13:32:24.044108 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.044153 kubelet[2825]: E1213 13:32:24.044123 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.044454 kubelet[2825]: I1213 13:32:24.044277 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghm6\" (UniqueName: \"kubernetes.io/projected/a99d7c4c-ae69-4d70-a627-2ff0fceee5d5-kube-api-access-gghm6\") pod \"csi-node-driver-gvpdf\" (UID: \"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5\") " pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:24.045010 kubelet[2825]: E1213 13:32:24.044981 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.045010 kubelet[2825]: W1213 13:32:24.045001 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.045099 kubelet[2825]: E1213 13:32:24.045022 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.045933 kubelet[2825]: E1213 13:32:24.045795 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.045933 kubelet[2825]: W1213 13:32:24.045923 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.046073 kubelet[2825]: E1213 13:32:24.046048 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.049157 kubelet[2825]: E1213 13:32:24.049120 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.049157 kubelet[2825]: W1213 13:32:24.049147 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.049157 kubelet[2825]: E1213 13:32:24.049169 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.050689 kubelet[2825]: I1213 13:32:24.049198 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a99d7c4c-ae69-4d70-a627-2ff0fceee5d5-kubelet-dir\") pod \"csi-node-driver-gvpdf\" (UID: \"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5\") " pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:24.050689 kubelet[2825]: E1213 13:32:24.049909 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.050689 kubelet[2825]: W1213 13:32:24.049926 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.050689 kubelet[2825]: E1213 13:32:24.049943 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.050689 kubelet[2825]: I1213 13:32:24.049969 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a99d7c4c-ae69-4d70-a627-2ff0fceee5d5-registration-dir\") pod \"csi-node-driver-gvpdf\" (UID: \"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5\") " pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:24.051720 containerd[1480]: time="2024-12-13T13:32:24.051644717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rsmpl,Uid:de3e5be1-d7f6-4eb0-b955-783f5b94bf29,Namespace:calico-system,Attempt:0,}"
Dec 13 13:32:24.052088 kubelet[2825]: E1213 13:32:24.051940 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.052088 kubelet[2825]: W1213 13:32:24.051967 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.052088 kubelet[2825]: E1213 13:32:24.051994 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.052920 kubelet[2825]: E1213 13:32:24.052863 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.054020 kubelet[2825]: W1213 13:32:24.053976 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.054307 kubelet[2825]: E1213 13:32:24.054278 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.055019 kubelet[2825]: E1213 13:32:24.054958 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.055019 kubelet[2825]: W1213 13:32:24.054980 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.055019 kubelet[2825]: E1213 13:32:24.055018 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.056114 kubelet[2825]: E1213 13:32:24.056090 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.056114 kubelet[2825]: W1213 13:32:24.056108 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.056114 kubelet[2825]: E1213 13:32:24.056125 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 13 13:32:24.082246 containerd[1480]: time="2024-12-13T13:32:24.082095162Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-94754b74d-bdxw7,Uid:4bb545d2-7125-4865-bac7-b977a16b0037,Namespace:calico-system,Attempt:0,} returns sandbox id \"bed24f411b123d4904d4ced5e386d52e2fefcb5a4b7400110270a4f96acb4c4b\"" Dec 13 13:32:24.092144 containerd[1480]: time="2024-12-13T13:32:24.092013275Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Dec 13 13:32:24.127524 containerd[1480]: time="2024-12-13T13:32:24.119881587Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:24.127524 containerd[1480]: time="2024-12-13T13:32:24.121093430Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:24.127524 containerd[1480]: time="2024-12-13T13:32:24.121110631Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:24.127524 containerd[1480]: time="2024-12-13T13:32:24.121324918Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:24.142886 systemd[1]: Started cri-containerd-d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573.scope - libcontainer container d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573. 
Dec 13 13:32:24.151021 kubelet[2825]: E1213 13:32:24.150987 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.151021 kubelet[2825]: W1213 13:32:24.151030 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.151182 kubelet[2825]: E1213 13:32:24.151053 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.151839 kubelet[2825]: E1213 13:32:24.151325 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.151839 kubelet[2825]: W1213 13:32:24.151337 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.151839 kubelet[2825]: E1213 13:32:24.151356 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.151839 kubelet[2825]: E1213 13:32:24.151670 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.151839 kubelet[2825]: W1213 13:32:24.151683 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.152007 kubelet[2825]: E1213 13:32:24.151849 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.152007 kubelet[2825]: E1213 13:32:24.151898 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.152007 kubelet[2825]: W1213 13:32:24.151904 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.152007 kubelet[2825]: E1213 13:32:24.151930 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.152385 kubelet[2825]: E1213 13:32:24.152145 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.152385 kubelet[2825]: W1213 13:32:24.152154 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.152385 kubelet[2825]: E1213 13:32:24.152173 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.152489 kubelet[2825]: E1213 13:32:24.152468 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.152489 kubelet[2825]: W1213 13:32:24.152483 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.152608 kubelet[2825]: E1213 13:32:24.152521 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.152752 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.153522 kubelet[2825]: W1213 13:32:24.152770 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.152796 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.153012 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.153522 kubelet[2825]: W1213 13:32:24.153021 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.153038 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.153231 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.153522 kubelet[2825]: W1213 13:32:24.153239 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.153250 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.153522 kubelet[2825]: E1213 13:32:24.153421 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.153804 kubelet[2825]: W1213 13:32:24.153430 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.153804 kubelet[2825]: E1213 13:32:24.153446 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.153804 kubelet[2825]: E1213 13:32:24.153662 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.153804 kubelet[2825]: W1213 13:32:24.153682 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.153804 kubelet[2825]: E1213 13:32:24.153699 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.153957 kubelet[2825]: E1213 13:32:24.153929 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.153957 kubelet[2825]: W1213 13:32:24.153947 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.154017 kubelet[2825]: E1213 13:32:24.153967 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.154211 kubelet[2825]: E1213 13:32:24.154188 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.154211 kubelet[2825]: W1213 13:32:24.154202 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.154274 kubelet[2825]: E1213 13:32:24.154220 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.154407 kubelet[2825]: E1213 13:32:24.154387 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.154407 kubelet[2825]: W1213 13:32:24.154399 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.154453 kubelet[2825]: E1213 13:32:24.154423 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.155520 kubelet[2825]: E1213 13:32:24.154686 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.155520 kubelet[2825]: W1213 13:32:24.154701 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.155520 kubelet[2825]: E1213 13:32:24.154736 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.155520 kubelet[2825]: E1213 13:32:24.154937 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.155520 kubelet[2825]: W1213 13:32:24.154945 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.155520 kubelet[2825]: E1213 13:32:24.155024 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.155520 kubelet[2825]: E1213 13:32:24.155267 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.155520 kubelet[2825]: W1213 13:32:24.155277 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.155520 kubelet[2825]: E1213 13:32:24.155381 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.155920 kubelet[2825]: E1213 13:32:24.155630 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.155920 kubelet[2825]: W1213 13:32:24.155642 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.155920 kubelet[2825]: E1213 13:32:24.155805 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.155991 kubelet[2825]: E1213 13:32:24.155961 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.155991 kubelet[2825]: W1213 13:32:24.155968 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.156033 kubelet[2825]: E1213 13:32:24.156008 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.156515 kubelet[2825]: E1213 13:32:24.156212 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.156515 kubelet[2825]: W1213 13:32:24.156229 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.156515 kubelet[2825]: E1213 13:32:24.156245 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.156643 kubelet[2825]: E1213 13:32:24.156629 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.156643 kubelet[2825]: W1213 13:32:24.156639 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.156693 kubelet[2825]: E1213 13:32:24.156655 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.156907 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.158484 kubelet[2825]: W1213 13:32:24.156923 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.156958 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.157261 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.158484 kubelet[2825]: W1213 13:32:24.157274 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.157324 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.157946 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.158484 kubelet[2825]: W1213 13:32:24.157960 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.157996 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.158484 kubelet[2825]: E1213 13:32:24.158319 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.159046 kubelet[2825]: W1213 13:32:24.158329 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.159046 kubelet[2825]: E1213 13:32:24.158342 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.181931 kubelet[2825]: E1213 13:32:24.181872 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:24.181931 kubelet[2825]: W1213 13:32:24.181903 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:24.181931 kubelet[2825]: E1213 13:32:24.181927 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:24.199762 containerd[1480]: time="2024-12-13T13:32:24.199712750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rsmpl,Uid:de3e5be1-d7f6-4eb0-b955-783f5b94bf29,Namespace:calico-system,Attempt:0,} returns sandbox id \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\""
Dec 13 13:32:25.683557 kubelet[2825]: E1213 13:32:25.681563 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5"
Dec 13 13:32:25.880759 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount960625873.mount: Deactivated successfully.
Dec 13 13:32:26.934918 containerd[1480]: time="2024-12-13T13:32:26.933697074Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:26.936837 containerd[1480]: time="2024-12-13T13:32:26.936777144Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29231308"
Dec 13 13:32:26.938720 containerd[1480]: time="2024-12-13T13:32:26.938676333Z" level=info msg="ImageCreate event name:\"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:26.942547 containerd[1480]: time="2024-12-13T13:32:26.942469309Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:26.944311 containerd[1480]: time="2024-12-13T13:32:26.943976483Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"29231162\" in 2.851862005s"
Dec 13 13:32:26.944311 containerd[1480]: time="2024-12-13T13:32:26.944114808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\""
Dec 13 13:32:26.947131 containerd[1480]: time="2024-12-13T13:32:26.945739787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Dec 13 13:32:26.964780 containerd[1480]: time="2024-12-13T13:32:26.964734870Z" level=info msg="CreateContainer within sandbox \"bed24f411b123d4904d4ced5e386d52e2fefcb5a4b7400110270a4f96acb4c4b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Dec 13 13:32:26.988486 containerd[1480]: time="2024-12-13T13:32:26.988425922Z" level=info msg="CreateContainer within sandbox \"bed24f411b123d4904d4ced5e386d52e2fefcb5a4b7400110270a4f96acb4c4b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d14cdc865aac6ae71d26ae6d7fa5d15355532254f5be4029de6ac798715a12eb\""
Dec 13 13:32:26.990482 containerd[1480]: time="2024-12-13T13:32:26.989274193Z" level=info msg="StartContainer for \"d14cdc865aac6ae71d26ae6d7fa5d15355532254f5be4029de6ac798715a12eb\""
Dec 13 13:32:27.027929 systemd[1]: Started cri-containerd-d14cdc865aac6ae71d26ae6d7fa5d15355532254f5be4029de6ac798715a12eb.scope - libcontainer container d14cdc865aac6ae71d26ae6d7fa5d15355532254f5be4029de6ac798715a12eb.
Dec 13 13:32:27.077907 containerd[1480]: time="2024-12-13T13:32:27.077826272Z" level=info msg="StartContainer for \"d14cdc865aac6ae71d26ae6d7fa5d15355532254f5be4029de6ac798715a12eb\" returns successfully"
Dec 13 13:32:27.682355 kubelet[2825]: E1213 13:32:27.682309 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5"
Dec 13 13:32:27.847673 kubelet[2825]: E1213 13:32:27.847402 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.847673 kubelet[2825]: W1213 13:32:27.847442 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.847673 kubelet[2825]: E1213 13:32:27.847467 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.848202 kubelet[2825]: E1213 13:32:27.848045 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.848202 kubelet[2825]: W1213 13:32:27.848061 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.848202 kubelet[2825]: E1213 13:32:27.848078 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.848736 kubelet[2825]: E1213 13:32:27.848546 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.848736 kubelet[2825]: W1213 13:32:27.848560 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.848736 kubelet[2825]: E1213 13:32:27.848577 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.849722 kubelet[2825]: E1213 13:32:27.849669 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.850110 kubelet[2825]: W1213 13:32:27.849966 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.850110 kubelet[2825]: E1213 13:32:27.850166 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.851414 kubelet[2825]: E1213 13:32:27.851088 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.851414 kubelet[2825]: W1213 13:32:27.851149 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.851414 kubelet[2825]: E1213 13:32:27.851195 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.854984 kubelet[2825]: I1213 13:32:27.853775 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-94754b74d-bdxw7" podStartSLOduration=1.999726682 podStartE2EDuration="4.853732644s" podCreationTimestamp="2024-12-13 13:32:23 +0000 UTC" firstStartedPulling="2024-12-13 13:32:24.090813632 +0000 UTC m=+24.568120894" lastFinishedPulling="2024-12-13 13:32:26.944819594 +0000 UTC m=+27.422126856" observedRunningTime="2024-12-13 13:32:27.85333843 +0000 UTC m=+28.330645692" watchObservedRunningTime="2024-12-13 13:32:27.853732644 +0000 UTC m=+28.331039906"
Dec 13 13:32:27.854984 kubelet[2825]: E1213 13:32:27.853816 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.854984 kubelet[2825]: W1213 13:32:27.854716 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.854984 kubelet[2825]: E1213 13:32:27.854781 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.855519 kubelet[2825]: E1213 13:32:27.855470 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.855519 kubelet[2825]: W1213 13:32:27.855492 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.855604 kubelet[2825]: E1213 13:32:27.855538 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.855901 kubelet[2825]: E1213 13:32:27.855872 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.855901 kubelet[2825]: W1213 13:32:27.855890 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.855979 kubelet[2825]: E1213 13:32:27.855909 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.856232 kubelet[2825]: E1213 13:32:27.856152 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.856232 kubelet[2825]: W1213 13:32:27.856210 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.856232 kubelet[2825]: E1213 13:32:27.856232 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.857860 kubelet[2825]: E1213 13:32:27.856445 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.857860 kubelet[2825]: W1213 13:32:27.856460 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.857860 kubelet[2825]: E1213 13:32:27.856475 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 13 13:32:27.857860 kubelet[2825]: E1213 13:32:27.857863 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 13 13:32:27.858362 kubelet[2825]: W1213 13:32:27.857881 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 13 13:32:27.858362 kubelet[2825]: E1213 13:32:27.857923 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 13 13:32:27.858600 kubelet[2825]: E1213 13:32:27.858560 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.858704 kubelet[2825]: W1213 13:32:27.858673 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.858737 kubelet[2825]: E1213 13:32:27.858708 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.859403 kubelet[2825]: E1213 13:32:27.859382 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.859403 kubelet[2825]: W1213 13:32:27.859399 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.859656 kubelet[2825]: E1213 13:32:27.859416 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.859969 kubelet[2825]: E1213 13:32:27.859945 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.860051 kubelet[2825]: W1213 13:32:27.859961 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.860051 kubelet[2825]: E1213 13:32:27.860012 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.860300 kubelet[2825]: E1213 13:32:27.860255 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.860300 kubelet[2825]: W1213 13:32:27.860263 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.860300 kubelet[2825]: E1213 13:32:27.860291 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.880513 kubelet[2825]: E1213 13:32:27.880197 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.880513 kubelet[2825]: W1213 13:32:27.880318 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.880513 kubelet[2825]: E1213 13:32:27.880359 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.881014 kubelet[2825]: E1213 13:32:27.880906 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.881014 kubelet[2825]: W1213 13:32:27.880922 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.881014 kubelet[2825]: E1213 13:32:27.880947 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.881615 kubelet[2825]: E1213 13:32:27.881591 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.881615 kubelet[2825]: W1213 13:32:27.881611 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.881745 kubelet[2825]: E1213 13:32:27.881652 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.881909 kubelet[2825]: E1213 13:32:27.881900 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.881946 kubelet[2825]: W1213 13:32:27.881910 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.881946 kubelet[2825]: E1213 13:32:27.881924 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.882739 kubelet[2825]: E1213 13:32:27.882630 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.882739 kubelet[2825]: W1213 13:32:27.882655 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.882843 kubelet[2825]: E1213 13:32:27.882755 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.883372 kubelet[2825]: E1213 13:32:27.883030 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.883372 kubelet[2825]: W1213 13:32:27.883055 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.883372 kubelet[2825]: E1213 13:32:27.883166 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.883372 kubelet[2825]: E1213 13:32:27.883227 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.883372 kubelet[2825]: W1213 13:32:27.883235 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.883372 kubelet[2825]: E1213 13:32:27.883254 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.884616 kubelet[2825]: E1213 13:32:27.883779 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.884616 kubelet[2825]: W1213 13:32:27.883818 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.884616 kubelet[2825]: E1213 13:32:27.884145 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.885461 kubelet[2825]: E1213 13:32:27.885427 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.885461 kubelet[2825]: W1213 13:32:27.885449 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.885582 kubelet[2825]: E1213 13:32:27.885471 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.885864 kubelet[2825]: E1213 13:32:27.885848 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.885864 kubelet[2825]: W1213 13:32:27.885862 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.886694 kubelet[2825]: E1213 13:32:27.886075 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.886694 kubelet[2825]: W1213 13:32:27.886084 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.886694 kubelet[2825]: E1213 13:32:27.886315 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.886694 kubelet[2825]: W1213 13:32:27.886344 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.886694 kubelet[2825]: E1213 13:32:27.886388 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.886694 kubelet[2825]: E1213 13:32:27.886447 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.886819 kubelet[2825]: E1213 13:32:27.886801 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.886819 kubelet[2825]: W1213 13:32:27.886811 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.886858 kubelet[2825]: E1213 13:32:27.886827 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.887541 kubelet[2825]: E1213 13:32:27.886891 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.888064 kubelet[2825]: E1213 13:32:27.888019 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.888064 kubelet[2825]: W1213 13:32:27.888040 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.888226 kubelet[2825]: E1213 13:32:27.888163 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.888768 kubelet[2825]: E1213 13:32:27.888569 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.888768 kubelet[2825]: W1213 13:32:27.888582 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.888768 kubelet[2825]: E1213 13:32:27.888602 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.889065 kubelet[2825]: E1213 13:32:27.889051 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.889351 kubelet[2825]: W1213 13:32:27.889117 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.889351 kubelet[2825]: E1213 13:32:27.889149 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:27.889752 kubelet[2825]: E1213 13:32:27.889736 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.889752 kubelet[2825]: W1213 13:32:27.889750 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.889895 kubelet[2825]: E1213 13:32:27.889771 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:27.890070 kubelet[2825]: E1213 13:32:27.890056 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:27.890101 kubelet[2825]: W1213 13:32:27.890069 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:27.890101 kubelet[2825]: E1213 13:32:27.890083 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.841171 kubelet[2825]: I1213 13:32:28.840005 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 13 13:32:28.869336 kubelet[2825]: E1213 13:32:28.869197 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.869336 kubelet[2825]: W1213 13:32:28.869222 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.869336 kubelet[2825]: E1213 13:32:28.869245 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.869712 kubelet[2825]: E1213 13:32:28.869694 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.869887 kubelet[2825]: W1213 13:32:28.869766 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.869887 kubelet[2825]: E1213 13:32:28.869788 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.870027 kubelet[2825]: E1213 13:32:28.870015 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.870098 kubelet[2825]: W1213 13:32:28.870077 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.870169 kubelet[2825]: E1213 13:32:28.870159 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.870474 kubelet[2825]: E1213 13:32:28.870456 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.870733 kubelet[2825]: W1213 13:32:28.870589 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.870733 kubelet[2825]: E1213 13:32:28.870610 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.870885 kubelet[2825]: E1213 13:32:28.870872 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.870937 kubelet[2825]: W1213 13:32:28.870926 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.870999 kubelet[2825]: E1213 13:32:28.870989 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.871314 kubelet[2825]: E1213 13:32:28.871292 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.871567 kubelet[2825]: W1213 13:32:28.871407 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.871567 kubelet[2825]: E1213 13:32:28.871433 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.871936 kubelet[2825]: E1213 13:32:28.871805 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.871936 kubelet[2825]: W1213 13:32:28.871820 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.871936 kubelet[2825]: E1213 13:32:28.871835 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.872339 kubelet[2825]: E1213 13:32:28.872207 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.872339 kubelet[2825]: W1213 13:32:28.872221 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.872339 kubelet[2825]: E1213 13:32:28.872236 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.872616 kubelet[2825]: E1213 13:32:28.872598 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.872873 kubelet[2825]: W1213 13:32:28.872731 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.872873 kubelet[2825]: E1213 13:32:28.872760 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.876175 kubelet[2825]: E1213 13:32:28.875738 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.876175 kubelet[2825]: W1213 13:32:28.875770 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.876175 kubelet[2825]: E1213 13:32:28.875804 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.877004 kubelet[2825]: E1213 13:32:28.876970 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.877343 kubelet[2825]: W1213 13:32:28.877137 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.877343 kubelet[2825]: E1213 13:32:28.877176 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.877970 kubelet[2825]: E1213 13:32:28.877855 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.877970 kubelet[2825]: W1213 13:32:28.877899 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.877970 kubelet[2825]: E1213 13:32:28.877922 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.878850 kubelet[2825]: E1213 13:32:28.878684 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.878850 kubelet[2825]: W1213 13:32:28.878705 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.878850 kubelet[2825]: E1213 13:32:28.878728 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.879452 kubelet[2825]: E1213 13:32:28.879171 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.879452 kubelet[2825]: W1213 13:32:28.879187 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.879452 kubelet[2825]: E1213 13:32:28.879210 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.879806 kubelet[2825]: E1213 13:32:28.879784 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.879907 kubelet[2825]: W1213 13:32:28.879889 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.880085 kubelet[2825]: E1213 13:32:28.879998 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.891732 kubelet[2825]: E1213 13:32:28.891012 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.891732 kubelet[2825]: W1213 13:32:28.891035 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.891732 kubelet[2825]: E1213 13:32:28.891061 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.891732 kubelet[2825]: E1213 13:32:28.891272 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.891732 kubelet[2825]: W1213 13:32:28.891283 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.891732 kubelet[2825]: E1213 13:32:28.891297 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.891732 kubelet[2825]: E1213 13:32:28.891471 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.891732 kubelet[2825]: W1213 13:32:28.891479 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.891732 kubelet[2825]: E1213 13:32:28.891513 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.894758 kubelet[2825]: E1213 13:32:28.894017 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.894758 kubelet[2825]: W1213 13:32:28.894043 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.894758 kubelet[2825]: E1213 13:32:28.894074 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.894758 kubelet[2825]: E1213 13:32:28.894327 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.894758 kubelet[2825]: W1213 13:32:28.894336 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.894758 kubelet[2825]: E1213 13:32:28.894421 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.894758 kubelet[2825]: E1213 13:32:28.894526 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.894758 kubelet[2825]: W1213 13:32:28.894534 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.894758 kubelet[2825]: E1213 13:32:28.894647 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.895422 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898192 kubelet[2825]: W1213 13:32:28.895435 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.895635 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.895690 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898192 kubelet[2825]: W1213 13:32:28.895696 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.895708 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.895876 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898192 kubelet[2825]: W1213 13:32:28.895883 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.895893 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.898192 kubelet[2825]: E1213 13:32:28.896210 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898422 kubelet[2825]: W1213 13:32:28.896218 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898422 kubelet[2825]: E1213 13:32:28.896229 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.898422 kubelet[2825]: E1213 13:32:28.896373 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898422 kubelet[2825]: W1213 13:32:28.896381 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898422 kubelet[2825]: E1213 13:32:28.896391 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.898422 kubelet[2825]: E1213 13:32:28.896513 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898422 kubelet[2825]: W1213 13:32:28.896519 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898422 kubelet[2825]: E1213 13:32:28.896531 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.898422 kubelet[2825]: E1213 13:32:28.896770 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898422 kubelet[2825]: W1213 13:32:28.896778 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.896789 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.897157 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898641 kubelet[2825]: W1213 13:32:28.897166 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.897179 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.897311 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898641 kubelet[2825]: W1213 13:32:28.897318 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.897327 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.897436 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898641 kubelet[2825]: W1213 13:32:28.897444 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898641 kubelet[2825]: E1213 13:32:28.897454 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:28.898964 kubelet[2825]: E1213 13:32:28.897628 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.898964 kubelet[2825]: W1213 13:32:28.897637 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.898964 kubelet[2825]: E1213 13:32:28.897651 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 13 13:32:28.900822 kubelet[2825]: E1213 13:32:28.900719 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 13 13:32:28.900822 kubelet[2825]: W1213 13:32:28.900741 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 13 13:32:28.900822 kubelet[2825]: E1213 13:32:28.900761 2825 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 13 13:32:29.032014 containerd[1480]: time="2024-12-13T13:32:29.031223675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:29.034071 containerd[1480]: time="2024-12-13T13:32:29.034019377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5117811" Dec 13 13:32:29.035384 containerd[1480]: time="2024-12-13T13:32:29.035350145Z" level=info msg="ImageCreate event name:\"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:29.038809 containerd[1480]: time="2024-12-13T13:32:29.038724668Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:29.039533 containerd[1480]: time="2024-12-13T13:32:29.039481336Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6487425\" in 2.093682788s" Dec 13 13:32:29.039745 containerd[1480]: time="2024-12-13T13:32:29.039632102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\"" Dec 13 13:32:29.045063 containerd[1480]: time="2024-12-13T13:32:29.045006738Z" level=info msg="CreateContainer within sandbox \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 13 13:32:29.068818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2957441603.mount: Deactivated successfully. Dec 13 13:32:29.070773 containerd[1480]: time="2024-12-13T13:32:29.070724916Z" level=info msg="CreateContainer within sandbox \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133\"" Dec 13 13:32:29.071803 containerd[1480]: time="2024-12-13T13:32:29.071579787Z" level=info msg="StartContainer for \"445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133\"" Dec 13 13:32:29.107823 systemd[1]: Started cri-containerd-445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133.scope - libcontainer container 445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133. Dec 13 13:32:29.146220 containerd[1480]: time="2024-12-13T13:32:29.146138708Z" level=info msg="StartContainer for \"445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133\" returns successfully" Dec 13 13:32:29.169090 systemd[1]: cri-containerd-445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133.scope: Deactivated successfully. Dec 13 13:32:29.207512 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133-rootfs.mount: Deactivated successfully. 
Dec 13 13:32:29.344419 containerd[1480]: time="2024-12-13T13:32:29.344199495Z" level=info msg="shim disconnected" id=445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133 namespace=k8s.io Dec 13 13:32:29.344419 containerd[1480]: time="2024-12-13T13:32:29.344293458Z" level=warning msg="cleaning up after shim disconnected" id=445eaed6c0ac0be9a37ca95e6c8f2d475c3e1bc082e1c2003b78804f465a1133 namespace=k8s.io Dec 13 13:32:29.344419 containerd[1480]: time="2024-12-13T13:32:29.344305659Z" level=info msg="cleaning up dead shim" namespace=k8s.io Dec 13 13:32:29.682561 kubelet[2825]: E1213 13:32:29.681667 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:29.854917 containerd[1480]: time="2024-12-13T13:32:29.851616650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Dec 13 13:32:30.205376 kubelet[2825]: I1213 13:32:30.205212 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 13 13:32:31.681812 kubelet[2825]: E1213 13:32:31.681696 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:32.657134 containerd[1480]: time="2024-12-13T13:32:32.657044677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:32.659067 containerd[1480]: time="2024-12-13T13:32:32.658992389Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes 
read=89703123" Dec 13 13:32:32.661276 containerd[1480]: time="2024-12-13T13:32:32.661186590Z" level=info msg="ImageCreate event name:\"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:32.664051 containerd[1480]: time="2024-12-13T13:32:32.663955852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:32.667951 containerd[1480]: time="2024-12-13T13:32:32.666065890Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"91072777\" in 2.814406559s" Dec 13 13:32:32.667951 containerd[1480]: time="2024-12-13T13:32:32.666126973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\"" Dec 13 13:32:32.671014 containerd[1480]: time="2024-12-13T13:32:32.670787825Z" level=info msg="CreateContainer within sandbox \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 13 13:32:32.690370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3461598548.mount: Deactivated successfully. 
Dec 13 13:32:32.696150 containerd[1480]: time="2024-12-13T13:32:32.695468457Z" level=info msg="CreateContainer within sandbox \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02\"" Dec 13 13:32:32.696645 containerd[1480]: time="2024-12-13T13:32:32.696533296Z" level=info msg="StartContainer for \"a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02\"" Dec 13 13:32:32.732277 systemd[1]: run-containerd-runc-k8s.io-a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02-runc.ujN8ct.mount: Deactivated successfully. Dec 13 13:32:32.738714 systemd[1]: Started cri-containerd-a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02.scope - libcontainer container a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02. Dec 13 13:32:32.786591 containerd[1480]: time="2024-12-13T13:32:32.786449339Z" level=info msg="StartContainer for \"a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02\" returns successfully" Dec 13 13:32:33.354524 containerd[1480]: time="2024-12-13T13:32:33.354419979Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 13 13:32:33.358195 systemd[1]: cri-containerd-a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02.scope: Deactivated successfully. 
Dec 13 13:32:33.369404 kubelet[2825]: I1213 13:32:33.368586 2825 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Dec 13 13:32:33.408917 kubelet[2825]: I1213 13:32:33.408866 2825 topology_manager.go:215] "Topology Admit Handler" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9" podNamespace="kube-system" podName="coredns-76f75df574-hgwlz" Dec 13 13:32:33.421697 kubelet[2825]: I1213 13:32:33.420640 2825 topology_manager.go:215] "Topology Admit Handler" podUID="bcc560db-f238-49e4-9766-c00316d8e479" podNamespace="calico-system" podName="calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:33.444794 kubelet[2825]: I1213 13:32:33.426491 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gmk\" (UniqueName: \"kubernetes.io/projected/93ef4867-9f5c-40e8-b3d6-6a06506fddf9-kube-api-access-55gmk\") pod \"coredns-76f75df574-hgwlz\" (UID: \"93ef4867-9f5c-40e8-b3d6-6a06506fddf9\") " pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:33.444794 kubelet[2825]: I1213 13:32:33.426543 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93ef4867-9f5c-40e8-b3d6-6a06506fddf9-config-volume\") pod \"coredns-76f75df574-hgwlz\" (UID: \"93ef4867-9f5c-40e8-b3d6-6a06506fddf9\") " pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:33.444794 kubelet[2825]: I1213 13:32:33.436155 2825 topology_manager.go:215] "Topology Admit Handler" podUID="74a12de7-657f-4aae-821f-e260248f542a" podNamespace="calico-apiserver" podName="calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:33.444794 kubelet[2825]: I1213 13:32:33.436906 2825 topology_manager.go:215] "Topology Admit Handler" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58" podNamespace="kube-system" podName="coredns-76f75df574-xdxpd" Dec 13 13:32:33.444794 kubelet[2825]: I1213 13:32:33.439615 2825 topology_manager.go:215] "Topology Admit Handler" 
podUID="587dd10b-ef35-45f6-8cda-437a5ce24419" podNamespace="calico-apiserver" podName="calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:33.425157 systemd[1]: Created slice kubepods-burstable-pod93ef4867_9f5c_40e8_b3d6_6a06506fddf9.slice - libcontainer container kubepods-burstable-pod93ef4867_9f5c_40e8_b3d6_6a06506fddf9.slice. Dec 13 13:32:33.449195 systemd[1]: Created slice kubepods-besteffort-podbcc560db_f238_49e4_9766_c00316d8e479.slice - libcontainer container kubepods-besteffort-podbcc560db_f238_49e4_9766_c00316d8e479.slice. Dec 13 13:32:33.457469 systemd[1]: Created slice kubepods-besteffort-pod74a12de7_657f_4aae_821f_e260248f542a.slice - libcontainer container kubepods-besteffort-pod74a12de7_657f_4aae_821f_e260248f542a.slice. Dec 13 13:32:33.471832 systemd[1]: Created slice kubepods-besteffort-pod587dd10b_ef35_45f6_8cda_437a5ce24419.slice - libcontainer container kubepods-besteffort-pod587dd10b_ef35_45f6_8cda_437a5ce24419.slice. Dec 13 13:32:33.480834 systemd[1]: Created slice kubepods-burstable-pod8d7d06e0_d386_4db3_9635_acc914ab1f58.slice - libcontainer container kubepods-burstable-pod8d7d06e0_d386_4db3_9635_acc914ab1f58.slice. 
Dec 13 13:32:33.528673 kubelet[2825]: I1213 13:32:33.528386 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mbj\" (UniqueName: \"kubernetes.io/projected/587dd10b-ef35-45f6-8cda-437a5ce24419-kube-api-access-d2mbj\") pod \"calico-apiserver-7db64dc7d4-49csf\" (UID: \"587dd10b-ef35-45f6-8cda-437a5ce24419\") " pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:33.528673 kubelet[2825]: I1213 13:32:33.528454 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgtld\" (UniqueName: \"kubernetes.io/projected/8d7d06e0-d386-4db3-9635-acc914ab1f58-kube-api-access-cgtld\") pod \"coredns-76f75df574-xdxpd\" (UID: \"8d7d06e0-d386-4db3-9635-acc914ab1f58\") " pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:33.528673 kubelet[2825]: I1213 13:32:33.528486 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/587dd10b-ef35-45f6-8cda-437a5ce24419-calico-apiserver-certs\") pod \"calico-apiserver-7db64dc7d4-49csf\" (UID: \"587dd10b-ef35-45f6-8cda-437a5ce24419\") " pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:33.528673 kubelet[2825]: I1213 13:32:33.528532 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k95wm\" (UniqueName: \"kubernetes.io/projected/74a12de7-657f-4aae-821f-e260248f542a-kube-api-access-k95wm\") pod \"calico-apiserver-7db64dc7d4-8w5p4\" (UID: \"74a12de7-657f-4aae-821f-e260248f542a\") " pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:33.528673 kubelet[2825]: I1213 13:32:33.528574 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d7d06e0-d386-4db3-9635-acc914ab1f58-config-volume\") pod 
\"coredns-76f75df574-xdxpd\" (UID: \"8d7d06e0-d386-4db3-9635-acc914ab1f58\") " pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:33.528934 kubelet[2825]: I1213 13:32:33.528602 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bcc560db-f238-49e4-9766-c00316d8e479-tigera-ca-bundle\") pod \"calico-kube-controllers-568896bf68-gs24f\" (UID: \"bcc560db-f238-49e4-9766-c00316d8e479\") " pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:33.528934 kubelet[2825]: I1213 13:32:33.528651 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/74a12de7-657f-4aae-821f-e260248f542a-calico-apiserver-certs\") pod \"calico-apiserver-7db64dc7d4-8w5p4\" (UID: \"74a12de7-657f-4aae-821f-e260248f542a\") " pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:33.530177 kubelet[2825]: I1213 13:32:33.530097 2825 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m29x2\" (UniqueName: \"kubernetes.io/projected/bcc560db-f238-49e4-9766-c00316d8e479-kube-api-access-m29x2\") pod \"calico-kube-controllers-568896bf68-gs24f\" (UID: \"bcc560db-f238-49e4-9766-c00316d8e479\") " pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:33.557573 containerd[1480]: time="2024-12-13T13:32:33.557337587Z" level=info msg="shim disconnected" id=a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02 namespace=k8s.io Dec 13 13:32:33.557573 containerd[1480]: time="2024-12-13T13:32:33.557407070Z" level=warning msg="cleaning up after shim disconnected" id=a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02 namespace=k8s.io Dec 13 13:32:33.557573 containerd[1480]: time="2024-12-13T13:32:33.557420150Z" level=info msg="cleaning up dead shim" namespace=k8s.io Dec 
13 13:32:33.694820 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a3dd4adacfda77ee8299f909becd68d2cda65425f595b8bbdeb4d0e4ede21e02-rootfs.mount: Deactivated successfully. Dec 13 13:32:33.703698 systemd[1]: Created slice kubepods-besteffort-poda99d7c4c_ae69_4d70_a627_2ff0fceee5d5.slice - libcontainer container kubepods-besteffort-poda99d7c4c_ae69_4d70_a627_2ff0fceee5d5.slice. Dec 13 13:32:33.707013 containerd[1480]: time="2024-12-13T13:32:33.706607645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:0,}" Dec 13 13:32:33.750283 containerd[1480]: time="2024-12-13T13:32:33.749902411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:0,}" Dec 13 13:32:33.761256 containerd[1480]: time="2024-12-13T13:32:33.760930020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:0,}" Dec 13 13:32:33.764838 containerd[1480]: time="2024-12-13T13:32:33.764516273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:0,}" Dec 13 13:32:33.782459 containerd[1480]: time="2024-12-13T13:32:33.782405297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:0,}" Dec 13 13:32:33.787547 containerd[1480]: time="2024-12-13T13:32:33.786966786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:0,}" Dec 13 13:32:33.856754 containerd[1480]: time="2024-12-13T13:32:33.856515327Z" level=error msg="Failed to destroy network for 
sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.857596 containerd[1480]: time="2024-12-13T13:32:33.856999345Z" level=error msg="encountered an error cleaning up failed sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.857701 containerd[1480]: time="2024-12-13T13:32:33.857611807Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.858243 kubelet[2825]: E1213 13:32:33.858104 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.858243 kubelet[2825]: E1213 13:32:33.858183 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:33.858243 kubelet[2825]: E1213 13:32:33.858207 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:33.858735 kubelet[2825]: E1213 13:32:33.858605 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:33.877709 kubelet[2825]: I1213 13:32:33.877242 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b" Dec 13 13:32:33.879363 containerd[1480]: time="2024-12-13T13:32:33.879145846Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:33.879903 containerd[1480]: time="2024-12-13T13:32:33.879759069Z" level=info msg="Ensure that sandbox 
c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b in task-service has been cleanup successfully" Dec 13 13:32:33.880947 containerd[1480]: time="2024-12-13T13:32:33.880476576Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:33.881115 containerd[1480]: time="2024-12-13T13:32:33.881041237Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:33.881870 containerd[1480]: time="2024-12-13T13:32:33.881394490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Dec 13 13:32:33.883917 containerd[1480]: time="2024-12-13T13:32:33.883795179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:1,}" Dec 13 13:32:33.960094 containerd[1480]: time="2024-12-13T13:32:33.959870681Z" level=error msg="Failed to destroy network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.962053 containerd[1480]: time="2024-12-13T13:32:33.961872995Z" level=error msg="encountered an error cleaning up failed sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.962053 containerd[1480]: time="2024-12-13T13:32:33.961995200Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:0,} failed, error" 
error="failed to setup network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.962707 kubelet[2825]: E1213 13:32:33.962324 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:33.962707 kubelet[2825]: E1213 13:32:33.962379 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:33.962707 kubelet[2825]: E1213 13:32:33.962402 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:33.962898 kubelet[2825]: E1213 13:32:33.962453 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\" with CreatePodSandboxError: \"Failed 
to create sandbox for pod \\\"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-hgwlz" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9" Dec 13 13:32:34.070910 containerd[1480]: time="2024-12-13T13:32:34.070508155Z" level=error msg="Failed to destroy network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.071828 containerd[1480]: time="2024-12-13T13:32:34.071649958Z" level=error msg="encountered an error cleaning up failed sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.071828 containerd[1480]: time="2024-12-13T13:32:34.071736361Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.072603 kubelet[2825]: E1213 13:32:34.072150 2825 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.072603 kubelet[2825]: E1213 13:32:34.072205 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:34.072603 kubelet[2825]: E1213 13:32:34.072228 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:34.072754 kubelet[2825]: E1213 13:32:34.072285 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podUID="bcc560db-f238-49e4-9766-c00316d8e479" Dec 13 13:32:34.082902 containerd[1480]: time="2024-12-13T13:32:34.082840495Z" level=error msg="Failed to destroy network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.084237 containerd[1480]: time="2024-12-13T13:32:34.084098541Z" level=error msg="encountered an error cleaning up failed sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.084237 containerd[1480]: time="2024-12-13T13:32:34.084185305Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.084580 kubelet[2825]: E1213 13:32:34.084416 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.084580 kubelet[2825]: E1213 13:32:34.084468 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:34.084580 kubelet[2825]: E1213 13:32:34.084525 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:34.084707 kubelet[2825]: E1213 13:32:34.084583 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:34.094577 containerd[1480]: time="2024-12-13T13:32:34.093449970Z" level=error msg="Failed to destroy network for 
sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.094577 containerd[1480]: time="2024-12-13T13:32:34.093892626Z" level=error msg="encountered an error cleaning up failed sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.094577 containerd[1480]: time="2024-12-13T13:32:34.093954148Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.095065 kubelet[2825]: E1213 13:32:34.095026 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.095121 kubelet[2825]: E1213 13:32:34.095090 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:34.095121 kubelet[2825]: E1213 13:32:34.095117 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:34.095208 kubelet[2825]: E1213 13:32:34.095188 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podUID="74a12de7-657f-4aae-821f-e260248f542a" Dec 13 13:32:34.101167 containerd[1480]: time="2024-12-13T13:32:34.099983093Z" level=error msg="Failed to destroy network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.101794 
containerd[1480]: time="2024-12-13T13:32:34.101715958Z" level=error msg="encountered an error cleaning up failed sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.101885 containerd[1480]: time="2024-12-13T13:32:34.101855243Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.102334 kubelet[2825]: E1213 13:32:34.102122 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.102334 kubelet[2825]: E1213 13:32:34.102175 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:34.102334 kubelet[2825]: E1213 13:32:34.102196 
2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:34.102456 kubelet[2825]: E1213 13:32:34.102254 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podUID="587dd10b-ef35-45f6-8cda-437a5ce24419" Dec 13 13:32:34.113575 containerd[1480]: time="2024-12-13T13:32:34.113507557Z" level=error msg="Failed to destroy network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.113998 containerd[1480]: time="2024-12-13T13:32:34.113967174Z" level=error msg="encountered an error cleaning up failed sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.114065 containerd[1480]: time="2024-12-13T13:32:34.114050457Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.114374 kubelet[2825]: E1213 13:32:34.114345 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:34.114438 kubelet[2825]: E1213 13:32:34.114408 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:34.114465 kubelet[2825]: E1213 13:32:34.114434 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:34.116349 kubelet[2825]: E1213 13:32:34.116270 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-xdxpd" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58" Dec 13 13:32:34.692699 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c-shm.mount: Deactivated successfully. Dec 13 13:32:34.692936 systemd[1]: run-netns-cni\x2d3b868a1e\x2d82f0\x2dd5a3\x2d85ea\x2d1d97e9630565.mount: Deactivated successfully. Dec 13 13:32:34.693012 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b-shm.mount: Deactivated successfully. 
Dec 13 13:32:34.880736 kubelet[2825]: I1213 13:32:34.880676 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246" Dec 13 13:32:34.884310 containerd[1480]: time="2024-12-13T13:32:34.881677724Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" Dec 13 13:32:34.884310 containerd[1480]: time="2024-12-13T13:32:34.881896732Z" level=info msg="Ensure that sandbox 28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246 in task-service has been cleanup successfully" Dec 13 13:32:34.884091 systemd[1]: run-netns-cni\x2df2b8f237\x2db75e\x2de115\x2d7011\x2d49869f76f936.mount: Deactivated successfully. Dec 13 13:32:34.886686 containerd[1480]: time="2024-12-13T13:32:34.884969326Z" level=info msg="TearDown network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully" Dec 13 13:32:34.886686 containerd[1480]: time="2024-12-13T13:32:34.885010488Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully" Dec 13 13:32:34.889349 containerd[1480]: time="2024-12-13T13:32:34.888784748Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:34.889672 containerd[1480]: time="2024-12-13T13:32:34.889507215Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:34.889672 containerd[1480]: time="2024-12-13T13:32:34.889523056Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:34.891337 kubelet[2825]: I1213 13:32:34.890674 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f" Dec 13 
13:32:34.891466 containerd[1480]: time="2024-12-13T13:32:34.890939069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:2,}" Dec 13 13:32:34.892595 containerd[1480]: time="2024-12-13T13:32:34.892368242Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:34.894447 containerd[1480]: time="2024-12-13T13:32:34.893714132Z" level=info msg="Ensure that sandbox 4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f in task-service has been cleanup successfully" Dec 13 13:32:34.894685 containerd[1480]: time="2024-12-13T13:32:34.894659567Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:34.894780 containerd[1480]: time="2024-12-13T13:32:34.894766491Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:34.898082 containerd[1480]: time="2024-12-13T13:32:34.897225623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:1,}" Dec 13 13:32:34.898219 kubelet[2825]: I1213 13:32:34.897557 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1" Dec 13 13:32:34.898523 containerd[1480]: time="2024-12-13T13:32:34.898465069Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:34.899162 containerd[1480]: time="2024-12-13T13:32:34.898771520Z" level=info msg="Ensure that sandbox 4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1 in task-service has been cleanup successfully" Dec 13 13:32:34.900135 containerd[1480]: time="2024-12-13T13:32:34.900091769Z" 
level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully" Dec 13 13:32:34.900268 containerd[1480]: time="2024-12-13T13:32:34.900253655Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully" Dec 13 13:32:34.901746 containerd[1480]: time="2024-12-13T13:32:34.901711070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:1,}" Dec 13 13:32:34.903655 kubelet[2825]: I1213 13:32:34.903504 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c" Dec 13 13:32:34.905642 containerd[1480]: time="2024-12-13T13:32:34.905277482Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\"" Dec 13 13:32:34.905642 containerd[1480]: time="2024-12-13T13:32:34.905468330Z" level=info msg="Ensure that sandbox 6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c in task-service has been cleanup successfully" Dec 13 13:32:34.906039 containerd[1480]: time="2024-12-13T13:32:34.905856944Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully" Dec 13 13:32:34.906039 containerd[1480]: time="2024-12-13T13:32:34.905881265Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully" Dec 13 13:32:34.905991 systemd[1]: run-netns-cni\x2d07c3f039\x2da8f8\x2d8a8c\x2d2581\x2d61c8d802f72f.mount: Deactivated successfully. 
Dec 13 13:32:34.907176 containerd[1480]: time="2024-12-13T13:32:34.906723056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:1,}" Dec 13 13:32:34.907292 kubelet[2825]: I1213 13:32:34.906985 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c" Dec 13 13:32:34.916712 containerd[1480]: time="2024-12-13T13:32:34.916077245Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:34.917458 systemd[1]: run-netns-cni\x2d6be6b3df\x2df362\x2d5104\x2df207\x2d21ddf1fbe2e8.mount: Deactivated successfully. Dec 13 13:32:34.917602 systemd[1]: run-netns-cni\x2d00da8b2a\x2d724d\x2df2c9\x2dfb29\x2d7f6a1047d7ef.mount: Deactivated successfully. Dec 13 13:32:34.921685 containerd[1480]: time="2024-12-13T13:32:34.920222919Z" level=info msg="Ensure that sandbox 234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c in task-service has been cleanup successfully" Dec 13 13:32:34.924999 kubelet[2825]: I1213 13:32:34.922807 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c" Dec 13 13:32:34.925153 containerd[1480]: time="2024-12-13T13:32:34.924257629Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:34.925153 containerd[1480]: time="2024-12-13T13:32:34.924573081Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:34.925153 containerd[1480]: time="2024-12-13T13:32:34.924759528Z" level=info msg="Ensure that sandbox 8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c in task-service has been cleanup successfully" Dec 13 13:32:34.925423 
containerd[1480]: time="2024-12-13T13:32:34.925386631Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:34.925465 systemd[1]: run-netns-cni\x2da2923f97\x2ddda4\x2d57c6\x2d46bc\x2da4aac8de441a.mount: Deactivated successfully. Dec 13 13:32:34.926062 containerd[1480]: time="2024-12-13T13:32:34.925810527Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:34.926062 containerd[1480]: time="2024-12-13T13:32:34.925523596Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:34.929121 containerd[1480]: time="2024-12-13T13:32:34.927749919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:1,}" Dec 13 13:32:34.931299 containerd[1480]: time="2024-12-13T13:32:34.930893316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:1,}" Dec 13 13:32:35.067569 containerd[1480]: time="2024-12-13T13:32:35.067239443Z" level=error msg="Failed to destroy network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.069584 containerd[1480]: time="2024-12-13T13:32:35.069529369Z" level=error msg="encountered an error cleaning up failed sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.069960 containerd[1480]: time="2024-12-13T13:32:35.069614732Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.071364 kubelet[2825]: E1213 13:32:35.070209 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.071364 kubelet[2825]: E1213 13:32:35.070281 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:35.071364 kubelet[2825]: E1213 13:32:35.070306 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:35.071584 kubelet[2825]: E1213 13:32:35.070383 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podUID="74a12de7-657f-4aae-821f-e260248f542a" Dec 13 13:32:35.129936 containerd[1480]: time="2024-12-13T13:32:35.129577893Z" level=error msg="Failed to destroy network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.131164 containerd[1480]: time="2024-12-13T13:32:35.131107870Z" level=error msg="encountered an error cleaning up failed sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.131819 containerd[1480]: time="2024-12-13T13:32:35.131742974Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.133783 kubelet[2825]: E1213 13:32:35.133183 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.133783 kubelet[2825]: E1213 13:32:35.133249 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:35.133783 kubelet[2825]: E1213 13:32:35.133272 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:35.134018 kubelet[2825]: E1213 13:32:35.133337 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:35.197358 containerd[1480]: time="2024-12-13T13:32:35.197294224Z" level=error msg="Failed to destroy network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.199479 containerd[1480]: time="2024-12-13T13:32:35.199246937Z" level=error msg="encountered an error cleaning up failed sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.199479 containerd[1480]: time="2024-12-13T13:32:35.199340781Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Dec 13 13:32:35.199755 kubelet[2825]: E1213 13:32:35.199625 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.199755 kubelet[2825]: E1213 13:32:35.199679 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:35.199755 kubelet[2825]: E1213 13:32:35.199699 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:35.200262 kubelet[2825]: E1213 13:32:35.199755 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-xdxpd" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58" Dec 13 13:32:35.204944 containerd[1480]: time="2024-12-13T13:32:35.204560976Z" level=error msg="Failed to destroy network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.206282 containerd[1480]: time="2024-12-13T13:32:35.206008030Z" level=error msg="encountered an error cleaning up failed sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.207129 containerd[1480]: time="2024-12-13T13:32:35.206647094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.207351 kubelet[2825]: E1213 13:32:35.207313 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.207413 kubelet[2825]: E1213 13:32:35.207402 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:35.207441 kubelet[2825]: E1213 13:32:35.207424 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:35.208683 kubelet[2825]: E1213 13:32:35.208595 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-hgwlz" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9" Dec 13 13:32:35.213649 containerd[1480]: 
time="2024-12-13T13:32:35.213561072Z" level=error msg="Failed to destroy network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.214039 containerd[1480]: time="2024-12-13T13:32:35.213991128Z" level=error msg="encountered an error cleaning up failed sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.214194 containerd[1480]: time="2024-12-13T13:32:35.214148574Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.214553 kubelet[2825]: E1213 13:32:35.214433 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.214749 kubelet[2825]: E1213 13:32:35.214694 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:35.215023 kubelet[2825]: E1213 13:32:35.214732 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:35.215023 kubelet[2825]: E1213 13:32:35.214961 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podUID="587dd10b-ef35-45f6-8cda-437a5ce24419" Dec 13 13:32:35.218556 containerd[1480]: time="2024-12-13T13:32:35.218367092Z" level=error msg="Failed to destroy network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.219286 containerd[1480]: time="2024-12-13T13:32:35.219106359Z" level=error msg="encountered an error cleaning up failed sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.219286 containerd[1480]: time="2024-12-13T13:32:35.219196963Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.220062 kubelet[2825]: E1213 13:32:35.219881 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:35.220062 kubelet[2825]: E1213 13:32:35.219942 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:35.220062 kubelet[2825]: E1213 13:32:35.219964 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:35.220269 kubelet[2825]: E1213 13:32:35.220022 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podUID="bcc560db-f238-49e4-9766-c00316d8e479" Dec 13 13:32:35.701629 systemd[1]: run-netns-cni\x2dbee7ed6a\x2d4a59\x2dbcc0\x2d9697\x2d87a9f1c0ca25.mount: Deactivated successfully. 
Dec 13 13:32:35.927296 kubelet[2825]: I1213 13:32:35.927203 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6" Dec 13 13:32:35.931621 containerd[1480]: time="2024-12-13T13:32:35.928130740Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:35.931621 containerd[1480]: time="2024-12-13T13:32:35.928325508Z" level=info msg="Ensure that sandbox 84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6 in task-service has been cleanup successfully" Dec 13 13:32:35.931621 containerd[1480]: time="2024-12-13T13:32:35.928679201Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully" Dec 13 13:32:35.931621 containerd[1480]: time="2024-12-13T13:32:35.928698802Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully" Dec 13 13:32:35.931299 systemd[1]: run-netns-cni\x2d4109d0ec\x2d17f3\x2d5ead\x2da47e\x2dd2e8c011f06b.mount: Deactivated successfully. 
Dec 13 13:32:35.934293 containerd[1480]: time="2024-12-13T13:32:35.932926000Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:35.934293 containerd[1480]: time="2024-12-13T13:32:35.933067285Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:35.934293 containerd[1480]: time="2024-12-13T13:32:35.933077125Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:35.936300 containerd[1480]: time="2024-12-13T13:32:35.934342933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:2,}" Dec 13 13:32:35.936908 kubelet[2825]: I1213 13:32:35.936189 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094" Dec 13 13:32:35.940586 containerd[1480]: time="2024-12-13T13:32:35.938779938Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:35.940586 containerd[1480]: time="2024-12-13T13:32:35.939029308Z" level=info msg="Ensure that sandbox 3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094 in task-service has been cleanup successfully" Dec 13 13:32:35.944528 containerd[1480]: time="2024-12-13T13:32:35.943001696Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully" Dec 13 13:32:35.944391 systemd[1]: run-netns-cni\x2d64fea983\x2d3595\x2d8cd3\x2d3885\x2d09d52bb5d0ec.mount: Deactivated successfully. 
Dec 13 13:32:35.945315 containerd[1480]: time="2024-12-13T13:32:35.943443233Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully" Dec 13 13:32:35.949520 containerd[1480]: time="2024-12-13T13:32:35.949308812Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:35.949875 containerd[1480]: time="2024-12-13T13:32:35.949738748Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully" Dec 13 13:32:35.949875 containerd[1480]: time="2024-12-13T13:32:35.949759189Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully" Dec 13 13:32:35.951025 containerd[1480]: time="2024-12-13T13:32:35.950962034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:2,}" Dec 13 13:32:35.955922 kubelet[2825]: I1213 13:32:35.953747 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9" Dec 13 13:32:35.956266 containerd[1480]: time="2024-12-13T13:32:35.956124267Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\"" Dec 13 13:32:35.957963 containerd[1480]: time="2024-12-13T13:32:35.957855131Z" level=info msg="Ensure that sandbox 3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9 in task-service has been cleanup successfully" Dec 13 13:32:35.961641 containerd[1480]: time="2024-12-13T13:32:35.958338630Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully" Dec 13 13:32:35.961641 containerd[1480]: time="2024-12-13T13:32:35.958380071Z" level=info msg="StopPodSandbox 
for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" returns successfully" Dec 13 13:32:35.964179 containerd[1480]: time="2024-12-13T13:32:35.963732871Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\"" Dec 13 13:32:35.963872 systemd[1]: run-netns-cni\x2d29eefd00\x2dddb4\x2d4e2c\x2dff6d\x2d9611ff106862.mount: Deactivated successfully. Dec 13 13:32:35.966622 containerd[1480]: time="2024-12-13T13:32:35.966485734Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully" Dec 13 13:32:35.966622 containerd[1480]: time="2024-12-13T13:32:35.966540616Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully" Dec 13 13:32:35.969321 kubelet[2825]: I1213 13:32:35.969096 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b" Dec 13 13:32:35.972242 containerd[1480]: time="2024-12-13T13:32:35.972182107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:2,}" Dec 13 13:32:35.973877 containerd[1480]: time="2024-12-13T13:32:35.973831489Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:35.974490 containerd[1480]: time="2024-12-13T13:32:35.974391630Z" level=info msg="Ensure that sandbox a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b in task-service has been cleanup successfully" Dec 13 13:32:35.986906 containerd[1480]: time="2024-12-13T13:32:35.979831513Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully" Dec 13 13:32:35.986906 containerd[1480]: time="2024-12-13T13:32:35.979869434Z" level=info 
msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully" Dec 13 13:32:35.986479 systemd[1]: run-netns-cni\x2d811103b9\x2d1512\x2d2b30\x2d9e1b\x2d222911038e96.mount: Deactivated successfully. Dec 13 13:32:35.988861 containerd[1480]: time="2024-12-13T13:32:35.988798848Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:35.989155 containerd[1480]: time="2024-12-13T13:32:35.989123660Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:35.989216 containerd[1480]: time="2024-12-13T13:32:35.989202383Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:35.989950 kubelet[2825]: I1213 13:32:35.989922 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd" Dec 13 13:32:35.995815 containerd[1480]: time="2024-12-13T13:32:35.995579421Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:35.997647 containerd[1480]: time="2024-12-13T13:32:35.997592017Z" level=info msg="Ensure that sandbox f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd in task-service has been cleanup successfully" Dec 13 13:32:35.999884 containerd[1480]: time="2024-12-13T13:32:35.999741817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:2,}" Dec 13 13:32:36.002006 containerd[1480]: time="2024-12-13T13:32:36.001947380Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully" Dec 13 13:32:36.005365 containerd[1480]: 
time="2024-12-13T13:32:36.005312426Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully" Dec 13 13:32:36.008712 containerd[1480]: time="2024-12-13T13:32:36.008576348Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:36.009267 containerd[1480]: time="2024-12-13T13:32:36.008963203Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:36.009267 containerd[1480]: time="2024-12-13T13:32:36.009126729Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:36.011581 containerd[1480]: time="2024-12-13T13:32:36.011534539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:2,}" Dec 13 13:32:36.016119 kubelet[2825]: I1213 13:32:36.016083 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464" Dec 13 13:32:36.018481 containerd[1480]: time="2024-12-13T13:32:36.018114306Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\"" Dec 13 13:32:36.018481 containerd[1480]: time="2024-12-13T13:32:36.018317074Z" level=info msg="Ensure that sandbox 011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464 in task-service has been cleanup successfully" Dec 13 13:32:36.018874 containerd[1480]: time="2024-12-13T13:32:36.018807852Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully" Dec 13 13:32:36.019914 containerd[1480]: time="2024-12-13T13:32:36.019562400Z" level=info msg="StopPodSandbox for 
\"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully" Dec 13 13:32:36.020276 containerd[1480]: time="2024-12-13T13:32:36.020015537Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" Dec 13 13:32:36.020276 containerd[1480]: time="2024-12-13T13:32:36.020143102Z" level=info msg="TearDown network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully" Dec 13 13:32:36.020276 containerd[1480]: time="2024-12-13T13:32:36.020161023Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully" Dec 13 13:32:36.020951 containerd[1480]: time="2024-12-13T13:32:36.020765365Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:36.020951 containerd[1480]: time="2024-12-13T13:32:36.020882170Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:36.020951 containerd[1480]: time="2024-12-13T13:32:36.020894330Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:36.022371 containerd[1480]: time="2024-12-13T13:32:36.021808885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:3,}" Dec 13 13:32:36.207359 containerd[1480]: time="2024-12-13T13:32:36.206235162Z" level=error msg="Failed to destroy network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.207359 containerd[1480]: time="2024-12-13T13:32:36.206989150Z" 
level=error msg="encountered an error cleaning up failed sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.207359 containerd[1480]: time="2024-12-13T13:32:36.207062153Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.207569 kubelet[2825]: E1213 13:32:36.207406 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.207569 kubelet[2825]: E1213 13:32:36.207487 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:36.207569 kubelet[2825]: E1213 13:32:36.207523 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:36.207659 kubelet[2825]: E1213 13:32:36.207577 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-xdxpd" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58" Dec 13 13:32:36.225558 containerd[1480]: time="2024-12-13T13:32:36.225434602Z" level=error msg="Failed to destroy network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.227637 containerd[1480]: time="2024-12-13T13:32:36.227522761Z" level=error msg="encountered an error cleaning up failed sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 
13:32:36.227637 containerd[1480]: time="2024-12-13T13:32:36.227613364Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.228386 kubelet[2825]: E1213 13:32:36.228038 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.228386 kubelet[2825]: E1213 13:32:36.228101 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:36.228386 kubelet[2825]: E1213 13:32:36.228129 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:36.228536 kubelet[2825]: E1213 13:32:36.228187 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podUID="74a12de7-657f-4aae-821f-e260248f542a" Dec 13 13:32:36.306421 containerd[1480]: time="2024-12-13T13:32:36.306227913Z" level=error msg="Failed to destroy network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.307746 containerd[1480]: time="2024-12-13T13:32:36.307616565Z" level=error msg="encountered an error cleaning up failed sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.308444 containerd[1480]: time="2024-12-13T13:32:36.308344952Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:2,} failed, error" error="failed 
to setup network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.308886 kubelet[2825]: E1213 13:32:36.308807 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.308966 kubelet[2825]: E1213 13:32:36.308912 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:36.308966 kubelet[2825]: E1213 13:32:36.308940 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:36.309023 kubelet[2825]: E1213 13:32:36.309008 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-hgwlz" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9" Dec 13 13:32:36.321841 containerd[1480]: time="2024-12-13T13:32:36.321688533Z" level=error msg="Failed to destroy network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.323816 containerd[1480]: time="2024-12-13T13:32:36.323754850Z" level=error msg="encountered an error cleaning up failed sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.323816 containerd[1480]: time="2024-12-13T13:32:36.323905016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.324568 kubelet[2825]: E1213 13:32:36.324531 2825 remote_runtime.go:193] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.324650 kubelet[2825]: E1213 13:32:36.324600 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:36.324650 kubelet[2825]: E1213 13:32:36.324625 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:36.324696 kubelet[2825]: E1213 13:32:36.324683 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podUID="587dd10b-ef35-45f6-8cda-437a5ce24419" Dec 13 13:32:36.330465 containerd[1480]: time="2024-12-13T13:32:36.330257854Z" level=error msg="Failed to destroy network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.332379 containerd[1480]: time="2024-12-13T13:32:36.332328372Z" level=error msg="encountered an error cleaning up failed sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.332875 containerd[1480]: time="2024-12-13T13:32:36.332668345Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.333175 kubelet[2825]: E1213 13:32:36.333121 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.333175 kubelet[2825]: E1213 13:32:36.333175 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:36.333377 kubelet[2825]: E1213 13:32:36.333195 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:36.333377 kubelet[2825]: E1213 13:32:36.333260 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:36.366809 containerd[1480]: time="2024-12-13T13:32:36.366757063Z" level=error msg="Failed to destroy network for sandbox 
\"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.368343 containerd[1480]: time="2024-12-13T13:32:36.368126675Z" level=error msg="encountered an error cleaning up failed sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.368343 containerd[1480]: time="2024-12-13T13:32:36.368219838Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.369342 kubelet[2825]: E1213 13:32:36.368777 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:36.369342 kubelet[2825]: E1213 13:32:36.368854 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:36.369342 kubelet[2825]: E1213 13:32:36.368878 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:36.370877 kubelet[2825]: E1213 13:32:36.368935 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podUID="bcc560db-f238-49e4-9766-c00316d8e479" Dec 13 13:32:36.697929 systemd[1]: run-netns-cni\x2dd37e39e8\x2d40ae\x2d4afd\x2def30\x2d39ad6b1af857.mount: Deactivated successfully. Dec 13 13:32:36.698852 systemd[1]: run-netns-cni\x2dc4944efc\x2d7229\x2d8542\x2d6608\x2d4575b202e31a.mount: Deactivated successfully. 
Dec 13 13:32:37.023133 kubelet[2825]: I1213 13:32:37.023023 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31" Dec 13 13:32:37.027246 containerd[1480]: time="2024-12-13T13:32:37.025354090Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" Dec 13 13:32:37.027246 containerd[1480]: time="2024-12-13T13:32:37.026014395Z" level=info msg="Ensure that sandbox 70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31 in task-service has been cleanup successfully" Dec 13 13:32:37.031541 containerd[1480]: time="2024-12-13T13:32:37.030780214Z" level=info msg="TearDown network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" successfully" Dec 13 13:32:37.031541 containerd[1480]: time="2024-12-13T13:32:37.030816935Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" returns successfully" Dec 13 13:32:37.031144 systemd[1]: run-netns-cni\x2d2474b174\x2d518d\x2d8768\x2d34f6\x2dcb2a3c53cb27.mount: Deactivated successfully. 
Dec 13 13:32:37.033913 containerd[1480]: time="2024-12-13T13:32:37.033254667Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:37.033913 containerd[1480]: time="2024-12-13T13:32:37.033394152Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully" Dec 13 13:32:37.033913 containerd[1480]: time="2024-12-13T13:32:37.033405313Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully" Dec 13 13:32:37.034251 containerd[1480]: time="2024-12-13T13:32:37.034183462Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:37.036567 containerd[1480]: time="2024-12-13T13:32:37.034701922Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:37.036567 containerd[1480]: time="2024-12-13T13:32:37.034729163Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:37.036567 containerd[1480]: time="2024-12-13T13:32:37.035314025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:3,}" Dec 13 13:32:37.036773 kubelet[2825]: I1213 13:32:37.035385 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca" Dec 13 13:32:37.037252 containerd[1480]: time="2024-12-13T13:32:37.037217136Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\"" Dec 13 13:32:37.037901 containerd[1480]: time="2024-12-13T13:32:37.037760197Z" level=info msg="Ensure that sandbox 
1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca in task-service has been cleanup successfully" Dec 13 13:32:37.039174 containerd[1480]: time="2024-12-13T13:32:37.039105847Z" level=info msg="TearDown network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" successfully" Dec 13 13:32:37.039363 containerd[1480]: time="2024-12-13T13:32:37.039273614Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" returns successfully" Dec 13 13:32:37.040984 containerd[1480]: time="2024-12-13T13:32:37.040803551Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\"" Dec 13 13:32:37.040984 containerd[1480]: time="2024-12-13T13:32:37.040928316Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully" Dec 13 13:32:37.040984 containerd[1480]: time="2024-12-13T13:32:37.040938956Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully" Dec 13 13:32:37.042516 containerd[1480]: time="2024-12-13T13:32:37.042439893Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" Dec 13 13:32:37.045701 kubelet[2825]: I1213 13:32:37.043998 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe" Dec 13 13:32:37.046019 systemd[1]: run-netns-cni\x2db878f4c3\x2d35c4\x2df7fb\x2d8ef1\x2d7171f14a9542.mount: Deactivated successfully. 
Dec 13 13:32:37.048438 containerd[1480]: time="2024-12-13T13:32:37.047390799Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" Dec 13 13:32:37.049897 containerd[1480]: time="2024-12-13T13:32:37.047838576Z" level=info msg="Ensure that sandbox 0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe in task-service has been cleanup successfully" Dec 13 13:32:37.050024 containerd[1480]: time="2024-12-13T13:32:37.049913854Z" level=info msg="TearDown network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" successfully" Dec 13 13:32:37.050024 containerd[1480]: time="2024-12-13T13:32:37.049935495Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" returns successfully" Dec 13 13:32:37.050024 containerd[1480]: time="2024-12-13T13:32:37.048412078Z" level=info msg="TearDown network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully" Dec 13 13:32:37.050024 containerd[1480]: time="2024-12-13T13:32:37.050003457Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully" Dec 13 13:32:37.053918 containerd[1480]: time="2024-12-13T13:32:37.053622474Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:37.053918 containerd[1480]: time="2024-12-13T13:32:37.053823641Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully" Dec 13 13:32:37.053918 containerd[1480]: time="2024-12-13T13:32:37.053834522Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully" Dec 13 13:32:37.054914 containerd[1480]: time="2024-12-13T13:32:37.053900244Z" level=info msg="StopPodSandbox for 
\"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:37.054914 containerd[1480]: time="2024-12-13T13:32:37.054128773Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:37.054914 containerd[1480]: time="2024-12-13T13:32:37.054139853Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:37.056744 systemd[1]: run-netns-cni\x2d3b50228c\x2dba3b\x2d3539\x2d6cf8\x2d2d444a006b43.mount: Deactivated successfully. Dec 13 13:32:37.059643 containerd[1480]: time="2024-12-13T13:32:37.058752067Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:37.059643 containerd[1480]: time="2024-12-13T13:32:37.058909433Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:37.059643 containerd[1480]: time="2024-12-13T13:32:37.058922313Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:37.059643 containerd[1480]: time="2024-12-13T13:32:37.059095840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:4,}" Dec 13 13:32:37.064215 containerd[1480]: time="2024-12-13T13:32:37.064172391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:3,}" Dec 13 13:32:37.066741 kubelet[2825]: I1213 13:32:37.065882 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087" Dec 13 13:32:37.067309 containerd[1480]: time="2024-12-13T13:32:37.067275668Z" level=info 
msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" Dec 13 13:32:37.068063 containerd[1480]: time="2024-12-13T13:32:37.068022256Z" level=info msg="Ensure that sandbox 9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087 in task-service has been cleanup successfully" Dec 13 13:32:37.069954 containerd[1480]: time="2024-12-13T13:32:37.069352026Z" level=info msg="TearDown network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" successfully" Dec 13 13:32:37.073261 containerd[1480]: time="2024-12-13T13:32:37.072928120Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" returns successfully" Dec 13 13:32:37.073424 kubelet[2825]: I1213 13:32:37.073381 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1" Dec 13 13:32:37.074615 systemd[1]: run-netns-cni\x2d49f24c13\x2d8d43\x2d852c\x2d6e52\x2df8800cf96e5f.mount: Deactivated successfully. 
Dec 13 13:32:37.075003 containerd[1480]: time="2024-12-13T13:32:37.074958997Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:37.076783 containerd[1480]: time="2024-12-13T13:32:37.076075919Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully" Dec 13 13:32:37.076783 containerd[1480]: time="2024-12-13T13:32:37.076730583Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully" Dec 13 13:32:37.078753 containerd[1480]: time="2024-12-13T13:32:37.078402486Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\"" Dec 13 13:32:37.078753 containerd[1480]: time="2024-12-13T13:32:37.078724458Z" level=info msg="Ensure that sandbox 78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1 in task-service has been cleanup successfully" Dec 13 13:32:37.079641 containerd[1480]: time="2024-12-13T13:32:37.079353042Z" level=info msg="TearDown network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" successfully" Dec 13 13:32:37.079641 containerd[1480]: time="2024-12-13T13:32:37.079381803Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" returns successfully" Dec 13 13:32:37.080887 containerd[1480]: time="2024-12-13T13:32:37.080782936Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\"" Dec 13 13:32:37.081396 containerd[1480]: time="2024-12-13T13:32:37.081267154Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully" Dec 13 13:32:37.081396 containerd[1480]: time="2024-12-13T13:32:37.081293955Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" 
returns successfully"
Dec 13 13:32:37.082289 containerd[1480]: time="2024-12-13T13:32:37.082102746Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\""
Dec 13 13:32:37.082890 containerd[1480]: time="2024-12-13T13:32:37.082767771Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully"
Dec 13 13:32:37.083149 containerd[1480]: time="2024-12-13T13:32:37.083130144Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully"
Dec 13 13:32:37.083737 containerd[1480]: time="2024-12-13T13:32:37.083381994Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\""
Dec 13 13:32:37.084764 containerd[1480]: time="2024-12-13T13:32:37.084738645Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully"
Dec 13 13:32:37.084949 containerd[1480]: time="2024-12-13T13:32:37.084928932Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully"
Dec 13 13:32:37.086908 containerd[1480]: time="2024-12-13T13:32:37.086801002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:3,}"
Dec 13 13:32:37.087605 containerd[1480]: time="2024-12-13T13:32:37.087096974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:3,}"
Dec 13 13:32:37.089527 kubelet[2825]: I1213 13:32:37.088702 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459"
Dec 13 13:32:37.090026 containerd[1480]: time="2024-12-13T13:32:37.089987442Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\""
Dec 13 13:32:37.090364 containerd[1480]: time="2024-12-13T13:32:37.090333455Z" level=info msg="Ensure that sandbox e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459 in task-service has been cleanup successfully"
Dec 13 13:32:37.090976 containerd[1480]: time="2024-12-13T13:32:37.090948799Z" level=info msg="TearDown network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" successfully"
Dec 13 13:32:37.091079 containerd[1480]: time="2024-12-13T13:32:37.091064083Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" returns successfully"
Dec 13 13:32:37.093011 containerd[1480]: time="2024-12-13T13:32:37.092957274Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\""
Dec 13 13:32:37.094189 containerd[1480]: time="2024-12-13T13:32:37.094142119Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully"
Dec 13 13:32:37.094189 containerd[1480]: time="2024-12-13T13:32:37.094179120Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully"
Dec 13 13:32:37.096858 containerd[1480]: time="2024-12-13T13:32:37.096662774Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\""
Dec 13 13:32:37.097241 containerd[1480]: time="2024-12-13T13:32:37.096960505Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully"
Dec 13 13:32:37.097241 containerd[1480]: time="2024-12-13T13:32:37.096978825Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully"
Dec 13 13:32:37.101096 containerd[1480]: time="2024-12-13T13:32:37.101045379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:3,}"
Dec 13 13:32:37.346829 containerd[1480]: time="2024-12-13T13:32:37.345491579Z" level=error msg="Failed to destroy network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.353300 containerd[1480]: time="2024-12-13T13:32:37.353217310Z" level=error msg="encountered an error cleaning up failed sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.353644 containerd[1480]: time="2024-12-13T13:32:37.353610364Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.359348 kubelet[2825]: E1213 13:32:37.358933 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.359348 kubelet[2825]: E1213 13:32:37.359004 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf"
Dec 13 13:32:37.359348 kubelet[2825]: E1213 13:32:37.359030 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf"
Dec 13 13:32:37.359579 kubelet[2825]: E1213 13:32:37.359093 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podUID="587dd10b-ef35-45f6-8cda-437a5ce24419"
Dec 13 13:32:37.388535 containerd[1480]: time="2024-12-13T13:32:37.388187386Z" level=error msg="Failed to destroy network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.389610 containerd[1480]: time="2024-12-13T13:32:37.389278987Z" level=error msg="encountered an error cleaning up failed sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.389876 containerd[1480]: time="2024-12-13T13:32:37.389759965Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.390734 kubelet[2825]: E1213 13:32:37.390283 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.390734 kubelet[2825]: E1213 13:32:37.390348 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd"
Dec 13 13:32:37.390734 kubelet[2825]: E1213 13:32:37.390368 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd"
Dec 13 13:32:37.390964 kubelet[2825]: E1213 13:32:37.390430 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-xdxpd" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58"
Dec 13 13:32:37.414830 containerd[1480]: time="2024-12-13T13:32:37.414670022Z" level=error msg="Failed to destroy network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.415436 containerd[1480]: time="2024-12-13T13:32:37.415236284Z" level=error msg="encountered an error cleaning up failed sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.415436 containerd[1480]: time="2024-12-13T13:32:37.415314167Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.415736 kubelet[2825]: E1213 13:32:37.415648 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.415736 kubelet[2825]: E1213 13:32:37.415712 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f"
Dec 13 13:32:37.415736 kubelet[2825]: E1213 13:32:37.415740 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f"
Dec 13 13:32:37.416073 kubelet[2825]: E1213 13:32:37.415809 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podUID="bcc560db-f238-49e4-9766-c00316d8e479"
Dec 13 13:32:37.428576 containerd[1480]: time="2024-12-13T13:32:37.428417340Z" level=error msg="Failed to destroy network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.429836 containerd[1480]: time="2024-12-13T13:32:37.429589144Z" level=error msg="encountered an error cleaning up failed sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.430076 containerd[1480]: time="2024-12-13T13:32:37.430036201Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.431731 kubelet[2825]: E1213 13:32:37.431700 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.432032 kubelet[2825]: E1213 13:32:37.432015 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4"
Dec 13 13:32:37.432160 kubelet[2825]: E1213 13:32:37.432136 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4"
Dec 13 13:32:37.432572 kubelet[2825]: E1213 13:32:37.432302 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podUID="74a12de7-657f-4aae-821f-e260248f542a"
Dec 13 13:32:37.433759 containerd[1480]: time="2024-12-13T13:32:37.433683738Z" level=error msg="Failed to destroy network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.434251 containerd[1480]: time="2024-12-13T13:32:37.434120635Z" level=error msg="encountered an error cleaning up failed sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.434251 containerd[1480]: time="2024-12-13T13:32:37.434204678Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.434604 kubelet[2825]: E1213 13:32:37.434461 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.434604 kubelet[2825]: E1213 13:32:37.434585 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:37.434604 kubelet[2825]: E1213 13:32:37.434607 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf"
Dec 13 13:32:37.434959 kubelet[2825]: E1213 13:32:37.434663 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5"
Dec 13 13:32:37.461096 containerd[1480]: time="2024-12-13T13:32:37.460776478Z" level=error msg="Failed to destroy network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.462375 containerd[1480]: time="2024-12-13T13:32:37.462293055Z" level=error msg="encountered an error cleaning up failed sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.462630 containerd[1480]: time="2024-12-13T13:32:37.462603067Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.462985 kubelet[2825]: E1213 13:32:37.462939 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 13 13:32:37.463052 kubelet[2825]: E1213 13:32:37.462998 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz"
Dec 13 13:32:37.463052 kubelet[2825]: E1213 13:32:37.463020 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz"
Dec 13 13:32:37.463128 kubelet[2825]: E1213 13:32:37.463080 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-hgwlz" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9"
Dec 13 13:32:37.697248 systemd[1]: run-netns-cni\x2da46b244f\x2dff6d\x2df5aa\x2d1ae4\x2d6cf710d1d86d.mount: Deactivated successfully.
Dec 13 13:32:37.697366 systemd[1]: run-netns-cni\x2d9c036ed4\x2d8ef1\x2d5806\x2d8c7d\x2dc99ee718b118.mount: Deactivated successfully.
Dec 13 13:32:38.095325 kubelet[2825]: I1213 13:32:38.095164 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5"
Dec 13 13:32:38.098559 containerd[1480]: time="2024-12-13T13:32:38.096548578Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\""
Dec 13 13:32:38.098559 containerd[1480]: time="2024-12-13T13:32:38.097341648Z" level=info msg="Ensure that sandbox 11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5 in task-service has been cleanup successfully"
Dec 13 13:32:38.101460 containerd[1480]: time="2024-12-13T13:32:38.100939984Z" level=info msg="TearDown network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" successfully"
Dec 13 13:32:38.101460 containerd[1480]: time="2024-12-13T13:32:38.101016947Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" returns successfully"
Dec 13 13:32:38.102460 containerd[1480]: time="2024-12-13T13:32:38.102115069Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\""
Dec 13 13:32:38.102460 containerd[1480]: time="2024-12-13T13:32:38.102205352Z" level=info msg="TearDown network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" successfully"
Dec 13 13:32:38.102460 containerd[1480]: time="2024-12-13T13:32:38.102216792Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" returns successfully"
Dec 13 13:32:38.103500 containerd[1480]: time="2024-12-13T13:32:38.103349795Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\""
Dec 13 13:32:38.105525 containerd[1480]: time="2024-12-13T13:32:38.105458635Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully"
Dec 13 13:32:38.105822 containerd[1480]: time="2024-12-13T13:32:38.105633281Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully"
Dec 13 13:32:38.106770 containerd[1480]: time="2024-12-13T13:32:38.106105299Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\""
Dec 13 13:32:38.106770 containerd[1480]: time="2024-12-13T13:32:38.106198823Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully"
Dec 13 13:32:38.106770 containerd[1480]: time="2024-12-13T13:32:38.106208223Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully"
Dec 13 13:32:38.106762 systemd[1]: run-netns-cni\x2dc2cdeece\x2d15fa\x2d1da9\x2dfa3f\x2d724de94bc60d.mount: Deactivated successfully.
Dec 13 13:32:38.109400 kubelet[2825]: I1213 13:32:38.107220 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0"
Dec 13 13:32:38.110073 containerd[1480]: time="2024-12-13T13:32:38.109589111Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\""
Dec 13 13:32:38.110073 containerd[1480]: time="2024-12-13T13:32:38.109861441Z" level=info msg="Ensure that sandbox 3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0 in task-service has been cleanup successfully"
Dec 13 13:32:38.110224 containerd[1480]: time="2024-12-13T13:32:38.110184613Z" level=info msg="TearDown network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" successfully"
Dec 13 13:32:38.110224 containerd[1480]: time="2024-12-13T13:32:38.110216694Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" returns successfully"
Dec 13 13:32:38.112678 containerd[1480]: time="2024-12-13T13:32:38.110727234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:4,}"
Dec 13 13:32:38.115091 containerd[1480]: time="2024-12-13T13:32:38.115039037Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\""
Dec 13 13:32:38.116846 systemd[1]: run-netns-cni\x2d7d7a921c\x2d29db\x2d0348\x2decb1\x2d165880a656f9.mount: Deactivated successfully.
Dec 13 13:32:38.118155 containerd[1480]: time="2024-12-13T13:32:38.118107712Z" level=info msg="TearDown network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" successfully"
Dec 13 13:32:38.118155 containerd[1480]: time="2024-12-13T13:32:38.118146154Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" returns successfully"
Dec 13 13:32:38.120148 containerd[1480]: time="2024-12-13T13:32:38.119665051Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\""
Dec 13 13:32:38.120148 containerd[1480]: time="2024-12-13T13:32:38.119794736Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully"
Dec 13 13:32:38.120148 containerd[1480]: time="2024-12-13T13:32:38.119807217Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully"
Dec 13 13:32:38.123775 kubelet[2825]: I1213 13:32:38.122699 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306"
Dec 13 13:32:38.123942 containerd[1480]: time="2024-12-13T13:32:38.122558681Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\""
Dec 13 13:32:38.123942 containerd[1480]: time="2024-12-13T13:32:38.123655882Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully"
Dec 13 13:32:38.123942 containerd[1480]: time="2024-12-13T13:32:38.123682803Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully"
Dec 13 13:32:38.124990 containerd[1480]: time="2024-12-13T13:32:38.124954691Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\""
Dec 13 13:32:38.125229 containerd[1480]: time="2024-12-13T13:32:38.125164979Z" level=info msg="Ensure that sandbox 5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306 in task-service has been cleanup successfully"
Dec 13 13:32:38.129225 containerd[1480]: time="2024-12-13T13:32:38.128958562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:4,}"
Dec 13 13:32:38.129240 systemd[1]: run-netns-cni\x2d5cd8b77a\x2d5045\x2d2bbf\x2d99ed\x2d2e78280d5b10.mount: Deactivated successfully.
Dec 13 13:32:38.131889 containerd[1480]: time="2024-12-13T13:32:38.130099605Z" level=info msg="TearDown network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" successfully"
Dec 13 13:32:38.131889 containerd[1480]: time="2024-12-13T13:32:38.130132447Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" returns successfully"
Dec 13 13:32:38.134955 containerd[1480]: time="2024-12-13T13:32:38.134777622Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\""
Dec 13 13:32:38.134955 containerd[1480]: time="2024-12-13T13:32:38.134921987Z" level=info msg="TearDown network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" successfully"
Dec 13 13:32:38.134955 containerd[1480]: time="2024-12-13T13:32:38.134935508Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" returns successfully"
Dec 13 13:32:38.136034 containerd[1480]: time="2024-12-13T13:32:38.135759859Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\""
Dec 13 13:32:38.136034 containerd[1480]: time="2024-12-13T13:32:38.135862663Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully"
Dec 13 13:32:38.136034 containerd[1480]: time="2024-12-13T13:32:38.135889984Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" returns successfully"
Dec 13 13:32:38.136970 containerd[1480]: time="2024-12-13T13:32:38.136929503Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\""
Dec 13 13:32:38.137459 containerd[1480]: time="2024-12-13T13:32:38.137338959Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully"
Dec 13 13:32:38.137459 containerd[1480]: time="2024-12-13T13:32:38.137370440Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully"
Dec 13 13:32:38.139045 kubelet[2825]: I1213 13:32:38.139017 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933"
Dec 13 13:32:38.140132 containerd[1480]: time="2024-12-13T13:32:38.139902415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:4,}"
Dec 13 13:32:38.140595 containerd[1480]: time="2024-12-13T13:32:38.140359873Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\""
Dec 13 13:32:38.140595 containerd[1480]: time="2024-12-13T13:32:38.140568001Z" level=info msg="Ensure that sandbox 685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933 in task-service has been cleanup successfully"
Dec 13 13:32:38.144540 systemd[1]: run-netns-cni\x2d287635ab\x2df7bc\x2d0c8d\x2d9ced\x2dace3d52d8ef5.mount: Deactivated successfully.
Dec 13 13:32:38.146800 containerd[1480]: time="2024-12-13T13:32:38.146389500Z" level=info msg="TearDown network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" successfully"
Dec 13 13:32:38.146800 containerd[1480]: time="2024-12-13T13:32:38.146430342Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" returns successfully"
Dec 13 13:32:38.149850 containerd[1480]: time="2024-12-13T13:32:38.149085842Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\""
Dec 13 13:32:38.149850 containerd[1480]: time="2024-12-13T13:32:38.149204607Z" level=info msg="TearDown network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" successfully"
Dec 13 13:32:38.149850 containerd[1480]: time="2024-12-13T13:32:38.149215087Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" returns successfully"
Dec 13 13:32:38.151565 containerd[1480]: time="2024-12-13T13:32:38.151150800Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\""
Dec 13 13:32:38.151565 containerd[1480]: time="2024-12-13T13:32:38.151249684Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully"
Dec 13 13:32:38.151565 containerd[1480]: time="2024-12-13T13:32:38.151260284Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully"
Dec 13 13:32:38.152837 containerd[1480]: time="2024-12-13T13:32:38.152341965Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\""
Dec 13 13:32:38.152837 containerd[1480]: time="2024-12-13T13:32:38.152439249Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec
13 13:32:38.152837 containerd[1480]: time="2024-12-13T13:32:38.152448249Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:38.154268 kubelet[2825]: I1213 13:32:38.154235 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80" Dec 13 13:32:38.154365 containerd[1480]: time="2024-12-13T13:32:38.154278438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:4,}" Dec 13 13:32:38.157933 containerd[1480]: time="2024-12-13T13:32:38.156995101Z" level=info msg="StopPodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" Dec 13 13:32:38.157933 containerd[1480]: time="2024-12-13T13:32:38.157553322Z" level=info msg="Ensure that sandbox a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80 in task-service has been cleanup successfully" Dec 13 13:32:38.160359 containerd[1480]: time="2024-12-13T13:32:38.160243544Z" level=info msg="TearDown network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" successfully" Dec 13 13:32:38.160359 containerd[1480]: time="2024-12-13T13:32:38.160302306Z" level=info msg="StopPodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" returns successfully" Dec 13 13:32:38.162204 containerd[1480]: time="2024-12-13T13:32:38.162061172Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" Dec 13 13:32:38.162204 containerd[1480]: time="2024-12-13T13:32:38.162204378Z" level=info msg="TearDown network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" successfully" Dec 13 13:32:38.162204 containerd[1480]: time="2024-12-13T13:32:38.162216778Z" level=info msg="StopPodSandbox for 
\"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" returns successfully" Dec 13 13:32:38.163366 containerd[1480]: time="2024-12-13T13:32:38.163080891Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:38.163366 containerd[1480]: time="2024-12-13T13:32:38.163219536Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully" Dec 13 13:32:38.163366 containerd[1480]: time="2024-12-13T13:32:38.163232616Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully" Dec 13 13:32:38.164767 containerd[1480]: time="2024-12-13T13:32:38.164705672Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:38.164952 containerd[1480]: time="2024-12-13T13:32:38.164850918Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:38.164952 containerd[1480]: time="2024-12-13T13:32:38.164862358Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:38.168672 containerd[1480]: time="2024-12-13T13:32:38.168571858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:4,}" Dec 13 13:32:38.172543 kubelet[2825]: I1213 13:32:38.171552 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1" Dec 13 13:32:38.175345 containerd[1480]: time="2024-12-13T13:32:38.175264631Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\"" Dec 13 13:32:38.175804 containerd[1480]: 
time="2024-12-13T13:32:38.175624444Z" level=info msg="Ensure that sandbox 79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1 in task-service has been cleanup successfully" Dec 13 13:32:38.175999 containerd[1480]: time="2024-12-13T13:32:38.175899775Z" level=info msg="TearDown network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" successfully" Dec 13 13:32:38.175999 containerd[1480]: time="2024-12-13T13:32:38.175939816Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" returns successfully" Dec 13 13:32:38.182644 containerd[1480]: time="2024-12-13T13:32:38.182591347Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\"" Dec 13 13:32:38.182840 containerd[1480]: time="2024-12-13T13:32:38.182713632Z" level=info msg="TearDown network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" successfully" Dec 13 13:32:38.182840 containerd[1480]: time="2024-12-13T13:32:38.182723632Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" returns successfully" Dec 13 13:32:38.185939 containerd[1480]: time="2024-12-13T13:32:38.185635822Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\"" Dec 13 13:32:38.185939 containerd[1480]: time="2024-12-13T13:32:38.185765507Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully" Dec 13 13:32:38.185939 containerd[1480]: time="2024-12-13T13:32:38.185776708Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully" Dec 13 13:32:38.187189 containerd[1480]: time="2024-12-13T13:32:38.186808347Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" Dec 13 13:32:38.187189 
containerd[1480]: time="2024-12-13T13:32:38.186943472Z" level=info msg="TearDown network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully" Dec 13 13:32:38.187189 containerd[1480]: time="2024-12-13T13:32:38.186954912Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully" Dec 13 13:32:38.187967 containerd[1480]: time="2024-12-13T13:32:38.187747822Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:38.187967 containerd[1480]: time="2024-12-13T13:32:38.187852386Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:38.187967 containerd[1480]: time="2024-12-13T13:32:38.187865027Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:38.189088 containerd[1480]: time="2024-12-13T13:32:38.189048191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:5,}" Dec 13 13:32:38.377670 containerd[1480]: time="2024-12-13T13:32:38.377315420Z" level=error msg="Failed to destroy network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.384897 containerd[1480]: time="2024-12-13T13:32:38.384835984Z" level=error msg="encountered an error cleaning up failed sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.385149 containerd[1480]: time="2024-12-13T13:32:38.385125315Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.385700 kubelet[2825]: E1213 13:32:38.385527 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.385700 kubelet[2825]: E1213 13:32:38.385587 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:38.385700 kubelet[2825]: E1213 13:32:38.385607 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:38.385850 kubelet[2825]: E1213 13:32:38.385673 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-hgwlz" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9" Dec 13 13:32:38.414301 containerd[1480]: time="2024-12-13T13:32:38.413644872Z" level=error msg="Failed to destroy network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.415511 containerd[1480]: time="2024-12-13T13:32:38.415285654Z" level=error msg="encountered an error cleaning up failed sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.416413 containerd[1480]: time="2024-12-13T13:32:38.416366695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox 
\"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.416896 kubelet[2825]: E1213 13:32:38.416857 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.417311 kubelet[2825]: E1213 13:32:38.417183 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:38.417311 kubelet[2825]: E1213 13:32:38.417212 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:38.417766 kubelet[2825]: E1213 13:32:38.417420 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-xdxpd" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58" Dec 13 13:32:38.455427 containerd[1480]: time="2024-12-13T13:32:38.455153400Z" level=error msg="Failed to destroy network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.461231 containerd[1480]: time="2024-12-13T13:32:38.460594245Z" level=error msg="encountered an error cleaning up failed sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.461231 containerd[1480]: time="2024-12-13T13:32:38.460696329Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.461435 kubelet[2825]: E1213 13:32:38.461035 2825 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.461435 kubelet[2825]: E1213 13:32:38.461102 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:38.461435 kubelet[2825]: E1213 13:32:38.461130 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:38.461556 kubelet[2825]: E1213 13:32:38.461192 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podUID="74a12de7-657f-4aae-821f-e260248f542a" Dec 13 13:32:38.480226 containerd[1480]: time="2024-12-13T13:32:38.479938136Z" level=error msg="Failed to destroy network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.482160 containerd[1480]: time="2024-12-13T13:32:38.481705242Z" level=error msg="encountered an error cleaning up failed sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.482436 containerd[1480]: time="2024-12-13T13:32:38.482400749Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.484040 kubelet[2825]: E1213 13:32:38.482855 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Dec 13 13:32:38.484040 kubelet[2825]: E1213 13:32:38.482929 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:38.484040 kubelet[2825]: E1213 13:32:38.482952 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:38.484237 kubelet[2825]: E1213 13:32:38.483006 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podUID="587dd10b-ef35-45f6-8cda-437a5ce24419" Dec 13 13:32:38.507446 containerd[1480]: time="2024-12-13T13:32:38.507393972Z" level=error msg="Failed 
to destroy network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.508097 containerd[1480]: time="2024-12-13T13:32:38.508064158Z" level=error msg="encountered an error cleaning up failed sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.508331 containerd[1480]: time="2024-12-13T13:32:38.508308807Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.508759 kubelet[2825]: E1213 13:32:38.508724 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.508873 kubelet[2825]: E1213 13:32:38.508786 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:38.508873 kubelet[2825]: E1213 13:32:38.508810 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:38.508967 kubelet[2825]: E1213 13:32:38.508899 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podUID="bcc560db-f238-49e4-9766-c00316d8e479" Dec 13 13:32:38.518071 containerd[1480]: time="2024-12-13T13:32:38.517939291Z" level=error msg="Failed to destroy network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.519318 containerd[1480]: time="2024-12-13T13:32:38.519115615Z" level=error msg="encountered an error cleaning up failed sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.519318 containerd[1480]: time="2024-12-13T13:32:38.519204578Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.519557 kubelet[2825]: E1213 13:32:38.519460 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:38.519603 kubelet[2825]: E1213 13:32:38.519575 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 
13 13:32:38.519603 kubelet[2825]: E1213 13:32:38.519598 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:38.519921 kubelet[2825]: E1213 13:32:38.519669 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:38.699735 systemd[1]: run-netns-cni\x2ddeef1808\x2d64cd\x2dc070\x2d948d\x2d3e6a857bfa61.mount: Deactivated successfully. Dec 13 13:32:38.699833 systemd[1]: run-netns-cni\x2dedb7a0e5\x2d4f32\x2d1116\x2d2090\x2d3b7eba8af1b2.mount: Deactivated successfully. 
Dec 13 13:32:39.180535 kubelet[2825]: I1213 13:32:39.180394 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c" Dec 13 13:32:39.189235 containerd[1480]: time="2024-12-13T13:32:39.189102977Z" level=info msg="StopPodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\"" Dec 13 13:32:39.190546 containerd[1480]: time="2024-12-13T13:32:39.190181378Z" level=info msg="Ensure that sandbox aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c in task-service has been cleanup successfully" Dec 13 13:32:39.192685 containerd[1480]: time="2024-12-13T13:32:39.192638191Z" level=info msg="TearDown network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" successfully" Dec 13 13:32:39.193129 containerd[1480]: time="2024-12-13T13:32:39.192886281Z" level=info msg="StopPodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" returns successfully" Dec 13 13:32:39.193221 systemd[1]: run-netns-cni\x2d21a3aec0\x2de4c6\x2de6c2\x2df50c\x2d1dee48e8279c.mount: Deactivated successfully. 
Dec 13 13:32:39.195780 containerd[1480]: time="2024-12-13T13:32:39.195664786Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\"" Dec 13 13:32:39.195904 containerd[1480]: time="2024-12-13T13:32:39.195784230Z" level=info msg="TearDown network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" successfully" Dec 13 13:32:39.195904 containerd[1480]: time="2024-12-13T13:32:39.195795311Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" returns successfully" Dec 13 13:32:39.199162 containerd[1480]: time="2024-12-13T13:32:39.198742182Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" Dec 13 13:32:39.199162 containerd[1480]: time="2024-12-13T13:32:39.198860667Z" level=info msg="TearDown network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" successfully" Dec 13 13:32:39.199162 containerd[1480]: time="2024-12-13T13:32:39.198872427Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" returns successfully" Dec 13 13:32:39.200147 containerd[1480]: time="2024-12-13T13:32:39.199881026Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:39.200147 containerd[1480]: time="2024-12-13T13:32:39.200002110Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully" Dec 13 13:32:39.200147 containerd[1480]: time="2024-12-13T13:32:39.200012791Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully" Dec 13 13:32:39.200733 containerd[1480]: time="2024-12-13T13:32:39.200442047Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:39.201009 
containerd[1480]: time="2024-12-13T13:32:39.200876863Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:39.201009 containerd[1480]: time="2024-12-13T13:32:39.200941426Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:39.201671 containerd[1480]: time="2024-12-13T13:32:39.201642372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:5,}" Dec 13 13:32:39.202289 kubelet[2825]: I1213 13:32:39.202251 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc" Dec 13 13:32:39.208199 containerd[1480]: time="2024-12-13T13:32:39.208130178Z" level=info msg="StopPodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\"" Dec 13 13:32:39.208570 containerd[1480]: time="2024-12-13T13:32:39.208320705Z" level=info msg="Ensure that sandbox a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc in task-service has been cleanup successfully" Dec 13 13:32:39.214242 containerd[1480]: time="2024-12-13T13:32:39.212244254Z" level=info msg="TearDown network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" successfully" Dec 13 13:32:39.214242 containerd[1480]: time="2024-12-13T13:32:39.213378377Z" level=info msg="StopPodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" returns successfully" Dec 13 13:32:39.214602 systemd[1]: run-netns-cni\x2d6ed6647f\x2d9a09\x2dd299\x2d9158\x2d6934c3338ede.mount: Deactivated successfully. 
Dec 13 13:32:39.222230 containerd[1480]: time="2024-12-13T13:32:39.221832897Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\"" Dec 13 13:32:39.222230 containerd[1480]: time="2024-12-13T13:32:39.222026104Z" level=info msg="TearDown network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" successfully" Dec 13 13:32:39.222230 containerd[1480]: time="2024-12-13T13:32:39.222040865Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" returns successfully" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.222439720Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.222562085Z" level=info msg="TearDown network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" successfully" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.222573405Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" returns successfully" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.222824735Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.222952700Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.222964900Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.223245831Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:39.223959 
containerd[1480]: time="2024-12-13T13:32:39.223319753Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully" Dec 13 13:32:39.223959 containerd[1480]: time="2024-12-13T13:32:39.223328914Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully" Dec 13 13:32:39.224847 containerd[1480]: time="2024-12-13T13:32:39.224459277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:5,}" Dec 13 13:32:39.233953 kubelet[2825]: I1213 13:32:39.233386 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3" Dec 13 13:32:39.235581 containerd[1480]: time="2024-12-13T13:32:39.235443133Z" level=info msg="StopPodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\"" Dec 13 13:32:39.236071 containerd[1480]: time="2024-12-13T13:32:39.236039755Z" level=info msg="Ensure that sandbox 493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3 in task-service has been cleanup successfully" Dec 13 13:32:39.240616 containerd[1480]: time="2024-12-13T13:32:39.240291916Z" level=info msg="TearDown network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" successfully" Dec 13 13:32:39.240616 containerd[1480]: time="2024-12-13T13:32:39.240455163Z" level=info msg="StopPodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" returns successfully" Dec 13 13:32:39.241720 systemd[1]: run-netns-cni\x2d92367074\x2db3a4\x2d9763\x2d6bc3\x2d15ac1479f625.mount: Deactivated successfully. 
Dec 13 13:32:39.244475 containerd[1480]: time="2024-12-13T13:32:39.244427633Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\"" Dec 13 13:32:39.245214 containerd[1480]: time="2024-12-13T13:32:39.244796847Z" level=info msg="TearDown network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" successfully" Dec 13 13:32:39.245214 containerd[1480]: time="2024-12-13T13:32:39.244822608Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" returns successfully" Dec 13 13:32:39.246342 containerd[1480]: time="2024-12-13T13:32:39.245947251Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\"" Dec 13 13:32:39.246872 containerd[1480]: time="2024-12-13T13:32:39.246743441Z" level=info msg="TearDown network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" successfully" Dec 13 13:32:39.246872 containerd[1480]: time="2024-12-13T13:32:39.246772762Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" returns successfully" Dec 13 13:32:39.248311 containerd[1480]: time="2024-12-13T13:32:39.248152694Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\"" Dec 13 13:32:39.248633 containerd[1480]: time="2024-12-13T13:32:39.248586431Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully" Dec 13 13:32:39.248633 containerd[1480]: time="2024-12-13T13:32:39.248609911Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" returns successfully" Dec 13 13:32:39.251100 containerd[1480]: time="2024-12-13T13:32:39.250885118Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\"" Dec 13 13:32:39.253600 
kubelet[2825]: I1213 13:32:39.253298 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d" Dec 13 13:32:39.253781 containerd[1480]: time="2024-12-13T13:32:39.253526538Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully" Dec 13 13:32:39.253781 containerd[1480]: time="2024-12-13T13:32:39.253558659Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully" Dec 13 13:32:39.256518 containerd[1480]: time="2024-12-13T13:32:39.255872867Z" level=info msg="StopPodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\"" Dec 13 13:32:39.256518 containerd[1480]: time="2024-12-13T13:32:39.256167758Z" level=info msg="Ensure that sandbox 47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d in task-service has been cleanup successfully" Dec 13 13:32:39.262960 containerd[1480]: time="2024-12-13T13:32:39.262807969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:5,}" Dec 13 13:32:39.266538 systemd[1]: run-netns-cni\x2d28971cec\x2d4638\x2d82bb\x2d6c80\x2da8fc0f9a4bc3.mount: Deactivated successfully. 
Dec 13 13:32:39.273664 containerd[1480]: time="2024-12-13T13:32:39.273303927Z" level=info msg="TearDown network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" successfully" Dec 13 13:32:39.273664 containerd[1480]: time="2024-12-13T13:32:39.273347009Z" level=info msg="StopPodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" returns successfully" Dec 13 13:32:39.278841 containerd[1480]: time="2024-12-13T13:32:39.277795617Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\"" Dec 13 13:32:39.280722 containerd[1480]: time="2024-12-13T13:32:39.279174509Z" level=info msg="TearDown network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" successfully" Dec 13 13:32:39.280722 containerd[1480]: time="2024-12-13T13:32:39.280206708Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" returns successfully" Dec 13 13:32:39.294631 containerd[1480]: time="2024-12-13T13:32:39.294115755Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" Dec 13 13:32:39.294631 containerd[1480]: time="2024-12-13T13:32:39.294230760Z" level=info msg="TearDown network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" successfully" Dec 13 13:32:39.294631 containerd[1480]: time="2024-12-13T13:32:39.294242600Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" returns successfully" Dec 13 13:32:39.298276 containerd[1480]: time="2024-12-13T13:32:39.297801135Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:39.298456 containerd[1480]: time="2024-12-13T13:32:39.297943660Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully" Dec 
13 13:32:39.298456 containerd[1480]: time="2024-12-13T13:32:39.298341675Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully" Dec 13 13:32:39.300078 kubelet[2825]: I1213 13:32:39.298765 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d" Dec 13 13:32:39.302699 containerd[1480]: time="2024-12-13T13:32:39.301373350Z" level=info msg="StopPodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\"" Dec 13 13:32:39.303688 containerd[1480]: time="2024-12-13T13:32:39.303410147Z" level=info msg="Ensure that sandbox 0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d in task-service has been cleanup successfully" Dec 13 13:32:39.304054 containerd[1480]: time="2024-12-13T13:32:39.304015290Z" level=info msg="TearDown network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" successfully" Dec 13 13:32:39.304225 containerd[1480]: time="2024-12-13T13:32:39.304048212Z" level=info msg="StopPodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" returns successfully" Dec 13 13:32:39.306154 containerd[1480]: time="2024-12-13T13:32:39.305667433Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:39.306441 containerd[1480]: time="2024-12-13T13:32:39.306403181Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:39.306486 containerd[1480]: time="2024-12-13T13:32:39.306438462Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:39.308400 containerd[1480]: time="2024-12-13T13:32:39.308318733Z" level=info msg="StopPodSandbox for 
\"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" Dec 13 13:32:39.310184 containerd[1480]: time="2024-12-13T13:32:39.309983636Z" level=info msg="TearDown network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" successfully" Dec 13 13:32:39.310184 containerd[1480]: time="2024-12-13T13:32:39.310134362Z" level=info msg="StopPodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" returns successfully" Dec 13 13:32:39.312190 containerd[1480]: time="2024-12-13T13:32:39.312125398Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" Dec 13 13:32:39.313042 containerd[1480]: time="2024-12-13T13:32:39.312812784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:5,}" Dec 13 13:32:39.317048 containerd[1480]: time="2024-12-13T13:32:39.315956783Z" level=info msg="TearDown network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" successfully" Dec 13 13:32:39.317048 containerd[1480]: time="2024-12-13T13:32:39.316020585Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" returns successfully" Dec 13 13:32:39.323719 containerd[1480]: time="2024-12-13T13:32:39.323656674Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:39.324099 containerd[1480]: time="2024-12-13T13:32:39.323888563Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully" Dec 13 13:32:39.324099 containerd[1480]: time="2024-12-13T13:32:39.324092251Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully" Dec 13 13:32:39.327141 containerd[1480]: 
time="2024-12-13T13:32:39.327073004Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:39.333345 containerd[1480]: time="2024-12-13T13:32:39.333105392Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:39.333345 containerd[1480]: time="2024-12-13T13:32:39.333153114Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:39.337066 containerd[1480]: time="2024-12-13T13:32:39.336778811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:5,}" Dec 13 13:32:39.345621 kubelet[2825]: I1213 13:32:39.344792 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054" Dec 13 13:32:39.349941 containerd[1480]: time="2024-12-13T13:32:39.349857587Z" level=info msg="StopPodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\"" Dec 13 13:32:39.350416 containerd[1480]: time="2024-12-13T13:32:39.350387247Z" level=info msg="Ensure that sandbox 56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054 in task-service has been cleanup successfully" Dec 13 13:32:39.351128 containerd[1480]: time="2024-12-13T13:32:39.351075753Z" level=info msg="TearDown network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" successfully" Dec 13 13:32:39.351128 containerd[1480]: time="2024-12-13T13:32:39.351110434Z" level=info msg="StopPodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" returns successfully" Dec 13 13:32:39.354147 containerd[1480]: time="2024-12-13T13:32:39.353965703Z" level=info msg="StopPodSandbox for 
\"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\"" Dec 13 13:32:39.354309 containerd[1480]: time="2024-12-13T13:32:39.354231753Z" level=info msg="TearDown network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" successfully" Dec 13 13:32:39.354309 containerd[1480]: time="2024-12-13T13:32:39.354246593Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" returns successfully" Dec 13 13:32:39.358207 containerd[1480]: time="2024-12-13T13:32:39.358047617Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\"" Dec 13 13:32:39.358207 containerd[1480]: time="2024-12-13T13:32:39.358189343Z" level=info msg="TearDown network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" successfully" Dec 13 13:32:39.358207 containerd[1480]: time="2024-12-13T13:32:39.358202143Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" returns successfully" Dec 13 13:32:39.361239 containerd[1480]: time="2024-12-13T13:32:39.361034330Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\"" Dec 13 13:32:39.361239 containerd[1480]: time="2024-12-13T13:32:39.361159535Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully" Dec 13 13:32:39.361239 containerd[1480]: time="2024-12-13T13:32:39.361169015Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully" Dec 13 13:32:39.362996 containerd[1480]: time="2024-12-13T13:32:39.362721514Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" Dec 13 13:32:39.362996 containerd[1480]: time="2024-12-13T13:32:39.362871120Z" level=info msg="TearDown network for sandbox 
\"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully" Dec 13 13:32:39.362996 containerd[1480]: time="2024-12-13T13:32:39.362884760Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully" Dec 13 13:32:39.367052 containerd[1480]: time="2024-12-13T13:32:39.366811629Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:39.367915 containerd[1480]: time="2024-12-13T13:32:39.367675702Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:39.367915 containerd[1480]: time="2024-12-13T13:32:39.367721104Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:39.370921 containerd[1480]: time="2024-12-13T13:32:39.370340323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:6,}" Dec 13 13:32:39.423920 containerd[1480]: time="2024-12-13T13:32:39.423764107Z" level=error msg="Failed to destroy network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.427746 containerd[1480]: time="2024-12-13T13:32:39.426981708Z" level=error msg="encountered an error cleaning up failed sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.427746 containerd[1480]: 
time="2024-12-13T13:32:39.427064032Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.430192 kubelet[2825]: E1213 13:32:39.429760 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.430192 kubelet[2825]: E1213 13:32:39.429861 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:39.430192 kubelet[2825]: E1213 13:32:39.429888 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-xdxpd" Dec 13 13:32:39.430608 kubelet[2825]: E1213 13:32:39.429970 
2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-xdxpd_kube-system(8d7d06e0-d386-4db3-9635-acc914ab1f58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-xdxpd" podUID="8d7d06e0-d386-4db3-9635-acc914ab1f58" Dec 13 13:32:39.628263 containerd[1480]: time="2024-12-13T13:32:39.628192531Z" level=error msg="Failed to destroy network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.630845 containerd[1480]: time="2024-12-13T13:32:39.630737307Z" level=error msg="encountered an error cleaning up failed sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.631462 containerd[1480]: time="2024-12-13T13:32:39.631137242Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.633114 kubelet[2825]: E1213 13:32:39.632192 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.633114 kubelet[2825]: E1213 13:32:39.632256 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:39.633114 kubelet[2825]: E1213 13:32:39.632292 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" Dec 13 13:32:39.633346 kubelet[2825]: E1213 13:32:39.632350 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7db64dc7d4-8w5p4_calico-apiserver(74a12de7-657f-4aae-821f-e260248f542a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podUID="74a12de7-657f-4aae-821f-e260248f542a" Dec 13 13:32:39.654427 containerd[1480]: time="2024-12-13T13:32:39.654382003Z" level=error msg="Failed to destroy network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.655761 containerd[1480]: time="2024-12-13T13:32:39.655358360Z" level=error msg="encountered an error cleaning up failed sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.655761 containerd[1480]: time="2024-12-13T13:32:39.655434363Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.656028 kubelet[2825]: E1213 13:32:39.655813 2825 remote_runtime.go:193] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.656028 kubelet[2825]: E1213 13:32:39.655887 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:39.656028 kubelet[2825]: E1213 13:32:39.655934 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-hgwlz" Dec 13 13:32:39.656282 kubelet[2825]: E1213 13:32:39.656003 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-hgwlz_kube-system(93ef4867-9f5c-40e8-b3d6-6a06506fddf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-hgwlz" podUID="93ef4867-9f5c-40e8-b3d6-6a06506fddf9" Dec 13 13:32:39.706573 systemd[1]: run-netns-cni\x2d29348d01\x2da751\x2d56ff\x2dc3a4\x2d6e61dd9e0da7.mount: Deactivated successfully. Dec 13 13:32:39.706711 systemd[1]: run-netns-cni\x2d896f5638\x2daa3d\x2d4b86\x2d987f\x2d4ff3e0ada0a1.mount: Deactivated successfully. Dec 13 13:32:39.711662 containerd[1480]: time="2024-12-13T13:32:39.711611651Z" level=error msg="Failed to destroy network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.714491 containerd[1480]: time="2024-12-13T13:32:39.714075944Z" level=error msg="encountered an error cleaning up failed sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.715656 containerd[1480]: time="2024-12-13T13:32:39.715602882Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.715823 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889-shm.mount: Deactivated successfully. 
Dec 13 13:32:39.717696 kubelet[2825]: E1213 13:32:39.717658 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.717822 kubelet[2825]: E1213 13:32:39.717763 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:39.717822 kubelet[2825]: E1213 13:32:39.717801 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" Dec 13 13:32:39.717999 kubelet[2825]: E1213 13:32:39.717884 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7db64dc7d4-49csf_calico-apiserver(587dd10b-ef35-45f6-8cda-437a5ce24419)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podUID="587dd10b-ef35-45f6-8cda-437a5ce24419" Dec 13 13:32:39.727530 containerd[1480]: time="2024-12-13T13:32:39.724757189Z" level=error msg="Failed to destroy network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.729230 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd-shm.mount: Deactivated successfully. Dec 13 13:32:39.730780 containerd[1480]: time="2024-12-13T13:32:39.730455565Z" level=error msg="encountered an error cleaning up failed sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.730894 containerd[1480]: time="2024-12-13T13:32:39.730822778Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.732342 kubelet[2825]: E1213 13:32:39.731478 2825 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.732471 kubelet[2825]: E1213 13:32:39.732392 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:39.732471 kubelet[2825]: E1213 13:32:39.732431 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gvpdf" Dec 13 13:32:39.732553 kubelet[2825]: E1213 13:32:39.732529 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gvpdf_calico-system(a99d7c4c-ae69-4d70-a627-2ff0fceee5d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gvpdf" podUID="a99d7c4c-ae69-4d70-a627-2ff0fceee5d5" Dec 13 13:32:39.762842 containerd[1480]: time="2024-12-13T13:32:39.762787189Z" level=error msg="Failed to destroy network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.765405 containerd[1480]: time="2024-12-13T13:32:39.765361447Z" level=error msg="encountered an error cleaning up failed sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.765712 containerd[1480]: time="2024-12-13T13:32:39.765619737Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.766475 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e-shm.mount: Deactivated successfully. 
Dec 13 13:32:39.766707 kubelet[2825]: E1213 13:32:39.766686 2825 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 13 13:32:39.766783 kubelet[2825]: E1213 13:32:39.766740 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:39.766783 kubelet[2825]: E1213 13:32:39.766765 2825 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" Dec 13 13:32:39.766858 kubelet[2825]: E1213 13:32:39.766836 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568896bf68-gs24f_calico-system(bcc560db-f238-49e4-9766-c00316d8e479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podUID="bcc560db-f238-49e4-9766-c00316d8e479" Dec 13 13:32:40.003798 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2347337178.mount: Deactivated successfully. Dec 13 13:32:40.046530 containerd[1480]: time="2024-12-13T13:32:40.046273773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:40.047653 containerd[1480]: time="2024-12-13T13:32:40.047535381Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=137671762" Dec 13 13:32:40.048757 containerd[1480]: time="2024-12-13T13:32:40.048683465Z" level=info msg="ImageCreate event name:\"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:40.052326 containerd[1480]: time="2024-12-13T13:32:40.052248921Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:40.052882 containerd[1480]: time="2024-12-13T13:32:40.052842703Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"137671624\" in 6.170565541s" Dec 13 13:32:40.052882 containerd[1480]: time="2024-12-13T13:32:40.052882465Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\"" Dec 13 13:32:40.064022 containerd[1480]: time="2024-12-13T13:32:40.063952525Z" level=info msg="CreateContainer within sandbox \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 13 13:32:40.085591 containerd[1480]: time="2024-12-13T13:32:40.085264975Z" level=info msg="CreateContainer within sandbox \"d25085a16741b3b92a40b96c47f1ab1f70da2a39289eaf45938d5c8f0fecd573\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f\"" Dec 13 13:32:40.087735 containerd[1480]: time="2024-12-13T13:32:40.086187370Z" level=info msg="StartContainer for \"df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f\"" Dec 13 13:32:40.121743 systemd[1]: Started cri-containerd-df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f.scope - libcontainer container df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f. Dec 13 13:32:40.168098 containerd[1480]: time="2024-12-13T13:32:40.168005319Z" level=info msg="StartContainer for \"df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f\" returns successfully" Dec 13 13:32:40.299881 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 13 13:32:40.300123 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Dec 13 13:32:40.358530 kubelet[2825]: I1213 13:32:40.358404 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd" Dec 13 13:32:40.360956 containerd[1480]: time="2024-12-13T13:32:40.360529155Z" level=info msg="StopPodSandbox for \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\"" Dec 13 13:32:40.360956 containerd[1480]: time="2024-12-13T13:32:40.360745723Z" level=info msg="Ensure that sandbox d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd in task-service has been cleanup successfully" Dec 13 13:32:40.362440 containerd[1480]: time="2024-12-13T13:32:40.362387025Z" level=info msg="TearDown network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\" successfully" Dec 13 13:32:40.362440 containerd[1480]: time="2024-12-13T13:32:40.362425227Z" level=info msg="StopPodSandbox for \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\" returns successfully" Dec 13 13:32:40.366648 containerd[1480]: time="2024-12-13T13:32:40.364222375Z" level=info msg="StopPodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\"" Dec 13 13:32:40.366648 containerd[1480]: time="2024-12-13T13:32:40.366491621Z" level=info msg="TearDown network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" successfully" Dec 13 13:32:40.366648 containerd[1480]: time="2024-12-13T13:32:40.366535983Z" level=info msg="StopPodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" returns successfully" Dec 13 13:32:40.367215 containerd[1480]: time="2024-12-13T13:32:40.367094644Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\"" Dec 13 13:32:40.367434 containerd[1480]: time="2024-12-13T13:32:40.367349014Z" level=info msg="TearDown network for sandbox 
\"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" successfully" Dec 13 13:32:40.367434 containerd[1480]: time="2024-12-13T13:32:40.367368534Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" returns successfully" Dec 13 13:32:40.368735 containerd[1480]: time="2024-12-13T13:32:40.368615142Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\"" Dec 13 13:32:40.369362 containerd[1480]: time="2024-12-13T13:32:40.368743587Z" level=info msg="TearDown network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" successfully" Dec 13 13:32:40.369362 containerd[1480]: time="2024-12-13T13:32:40.368755227Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" returns successfully" Dec 13 13:32:40.370485 containerd[1480]: time="2024-12-13T13:32:40.370133680Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\"" Dec 13 13:32:40.370966 containerd[1480]: time="2024-12-13T13:32:40.370592177Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully" Dec 13 13:32:40.371525 containerd[1480]: time="2024-12-13T13:32:40.370618698Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully" Dec 13 13:32:40.373481 kubelet[2825]: I1213 13:32:40.372055 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8" Dec 13 13:32:40.373808 containerd[1480]: time="2024-12-13T13:32:40.372659696Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\"" Dec 13 13:32:40.373808 containerd[1480]: time="2024-12-13T13:32:40.372786100Z" level=info msg="TearDown network for 
sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully" Dec 13 13:32:40.373808 containerd[1480]: time="2024-12-13T13:32:40.372795981Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully" Dec 13 13:32:40.373808 containerd[1480]: time="2024-12-13T13:32:40.373454686Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\"" Dec 13 13:32:40.373808 containerd[1480]: time="2024-12-13T13:32:40.373678134Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully" Dec 13 13:32:40.373808 containerd[1480]: time="2024-12-13T13:32:40.373695735Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully" Dec 13 13:32:40.373979 containerd[1480]: time="2024-12-13T13:32:40.373812459Z" level=info msg="StopPodSandbox for \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\"" Dec 13 13:32:40.375058 containerd[1480]: time="2024-12-13T13:32:40.374111511Z" level=info msg="Ensure that sandbox bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8 in task-service has been cleanup successfully" Dec 13 13:32:40.375058 containerd[1480]: time="2024-12-13T13:32:40.374710573Z" level=info msg="TearDown network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\" successfully" Dec 13 13:32:40.375058 containerd[1480]: time="2024-12-13T13:32:40.374846299Z" level=info msg="StopPodSandbox for \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\" returns successfully" Dec 13 13:32:40.375871 containerd[1480]: time="2024-12-13T13:32:40.375706891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:7,}" Dec 13 13:32:40.379238 containerd[1480]: 
time="2024-12-13T13:32:40.378348792Z" level=info msg="StopPodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\"" Dec 13 13:32:40.379238 containerd[1480]: time="2024-12-13T13:32:40.378458396Z" level=info msg="TearDown network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" successfully" Dec 13 13:32:40.379238 containerd[1480]: time="2024-12-13T13:32:40.378467996Z" level=info msg="StopPodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" returns successfully" Dec 13 13:32:40.380278 containerd[1480]: time="2024-12-13T13:32:40.380059537Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\"" Dec 13 13:32:40.380278 containerd[1480]: time="2024-12-13T13:32:40.380161261Z" level=info msg="TearDown network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" successfully" Dec 13 13:32:40.380278 containerd[1480]: time="2024-12-13T13:32:40.380170741Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" returns successfully" Dec 13 13:32:40.381376 containerd[1480]: time="2024-12-13T13:32:40.381188620Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" Dec 13 13:32:40.381376 containerd[1480]: time="2024-12-13T13:32:40.381303584Z" level=info msg="TearDown network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" successfully" Dec 13 13:32:40.381376 containerd[1480]: time="2024-12-13T13:32:40.381314504Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" returns successfully" Dec 13 13:32:40.384606 containerd[1480]: time="2024-12-13T13:32:40.383906563Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:40.385470 containerd[1480]: 
time="2024-12-13T13:32:40.384863359Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully" Dec 13 13:32:40.385470 containerd[1480]: time="2024-12-13T13:32:40.384893720Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully" Dec 13 13:32:40.386993 containerd[1480]: time="2024-12-13T13:32:40.386724510Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:40.389386 containerd[1480]: time="2024-12-13T13:32:40.389235365Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully" Dec 13 13:32:40.390293 containerd[1480]: time="2024-12-13T13:32:40.389270807Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully" Dec 13 13:32:40.390360 kubelet[2825]: I1213 13:32:40.390148 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f" Dec 13 13:32:40.393466 containerd[1480]: time="2024-12-13T13:32:40.392624694Z" level=info msg="StopPodSandbox for \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\"" Dec 13 13:32:40.395903 containerd[1480]: time="2024-12-13T13:32:40.395867457Z" level=info msg="Ensure that sandbox 6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f in task-service has been cleanup successfully" Dec 13 13:32:40.397413 containerd[1480]: time="2024-12-13T13:32:40.397293232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:6,}" Dec 13 13:32:40.398828 containerd[1480]: time="2024-12-13T13:32:40.398799649Z" level=info msg="TearDown network for sandbox 
\"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\" successfully" Dec 13 13:32:40.399274 containerd[1480]: time="2024-12-13T13:32:40.399149062Z" level=info msg="StopPodSandbox for \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\" returns successfully" Dec 13 13:32:40.400401 containerd[1480]: time="2024-12-13T13:32:40.400163621Z" level=info msg="StopPodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\"" Dec 13 13:32:40.401050 containerd[1480]: time="2024-12-13T13:32:40.400965011Z" level=info msg="TearDown network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" successfully" Dec 13 13:32:40.401050 containerd[1480]: time="2024-12-13T13:32:40.400985332Z" level=info msg="StopPodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" returns successfully" Dec 13 13:32:40.402239 containerd[1480]: time="2024-12-13T13:32:40.402096254Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\"" Dec 13 13:32:40.402239 containerd[1480]: time="2024-12-13T13:32:40.402185297Z" level=info msg="TearDown network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" successfully" Dec 13 13:32:40.402239 containerd[1480]: time="2024-12-13T13:32:40.402195338Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" returns successfully" Dec 13 13:32:40.403854 containerd[1480]: time="2024-12-13T13:32:40.403620512Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\"" Dec 13 13:32:40.404115 containerd[1480]: time="2024-12-13T13:32:40.403824440Z" level=info msg="TearDown network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" successfully" Dec 13 13:32:40.404731 containerd[1480]: time="2024-12-13T13:32:40.404249096Z" level=info msg="StopPodSandbox for 
\"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" returns successfully" Dec 13 13:32:40.406196 containerd[1480]: time="2024-12-13T13:32:40.405277535Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\"" Dec 13 13:32:40.406196 containerd[1480]: time="2024-12-13T13:32:40.406092246Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully" Dec 13 13:32:40.406196 containerd[1480]: time="2024-12-13T13:32:40.406108126Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" returns successfully" Dec 13 13:32:40.407389 containerd[1480]: time="2024-12-13T13:32:40.406753031Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\"" Dec 13 13:32:40.407389 containerd[1480]: time="2024-12-13T13:32:40.406842274Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully" Dec 13 13:32:40.407389 containerd[1480]: time="2024-12-13T13:32:40.406853555Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully" Dec 13 13:32:40.408946 kubelet[2825]: I1213 13:32:40.408004 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889" Dec 13 13:32:40.409888 containerd[1480]: time="2024-12-13T13:32:40.409767986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:6,}" Dec 13 13:32:40.410062 containerd[1480]: time="2024-12-13T13:32:40.410030516Z" level=info msg="StopPodSandbox for \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\"" Dec 13 13:32:40.410229 containerd[1480]: time="2024-12-13T13:32:40.410195802Z" 
level=info msg="Ensure that sandbox d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889 in task-service has been cleanup successfully" Dec 13 13:32:40.411740 containerd[1480]: time="2024-12-13T13:32:40.411554613Z" level=info msg="TearDown network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\" successfully" Dec 13 13:32:40.411740 containerd[1480]: time="2024-12-13T13:32:40.411585815Z" level=info msg="StopPodSandbox for \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\" returns successfully" Dec 13 13:32:40.413600 containerd[1480]: time="2024-12-13T13:32:40.413557930Z" level=info msg="StopPodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\"" Dec 13 13:32:40.414238 containerd[1480]: time="2024-12-13T13:32:40.414039228Z" level=info msg="TearDown network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" successfully" Dec 13 13:32:40.414566 containerd[1480]: time="2024-12-13T13:32:40.414447083Z" level=info msg="StopPodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" returns successfully" Dec 13 13:32:40.416716 containerd[1480]: time="2024-12-13T13:32:40.416559764Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\"" Dec 13 13:32:40.416716 containerd[1480]: time="2024-12-13T13:32:40.416682768Z" level=info msg="TearDown network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" successfully" Dec 13 13:32:40.416716 containerd[1480]: time="2024-12-13T13:32:40.416693369Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" returns successfully" Dec 13 13:32:40.418672 containerd[1480]: time="2024-12-13T13:32:40.418241708Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" Dec 13 13:32:40.419304 containerd[1480]: 
time="2024-12-13T13:32:40.419096020Z" level=info msg="TearDown network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" successfully" Dec 13 13:32:40.419304 containerd[1480]: time="2024-12-13T13:32:40.419165943Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" returns successfully" Dec 13 13:32:40.422059 containerd[1480]: time="2024-12-13T13:32:40.421705999Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:40.422059 containerd[1480]: time="2024-12-13T13:32:40.421811723Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully" Dec 13 13:32:40.422059 containerd[1480]: time="2024-12-13T13:32:40.421823484Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully" Dec 13 13:32:40.423074 containerd[1480]: time="2024-12-13T13:32:40.422896964Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:40.423074 containerd[1480]: time="2024-12-13T13:32:40.423019009Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:40.423074 containerd[1480]: time="2024-12-13T13:32:40.423030410Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:40.425812 containerd[1480]: time="2024-12-13T13:32:40.425750313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:6,}" Dec 13 13:32:40.426037 kubelet[2825]: I1213 13:32:40.425768 2825 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e" Dec 13 13:32:40.430803 containerd[1480]: time="2024-12-13T13:32:40.430654779Z" level=info msg="StopPodSandbox for \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\"" Dec 13 13:32:40.432169 containerd[1480]: time="2024-12-13T13:32:40.432043432Z" level=info msg="Ensure that sandbox b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e in task-service has been cleanup successfully" Dec 13 13:32:40.432804 containerd[1480]: time="2024-12-13T13:32:40.432648135Z" level=info msg="TearDown network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\" successfully" Dec 13 13:32:40.432804 containerd[1480]: time="2024-12-13T13:32:40.432679976Z" level=info msg="StopPodSandbox for \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\" returns successfully" Dec 13 13:32:40.434146 containerd[1480]: time="2024-12-13T13:32:40.434115951Z" level=info msg="StopPodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\"" Dec 13 13:32:40.434591 containerd[1480]: time="2024-12-13T13:32:40.434457524Z" level=info msg="TearDown network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" successfully" Dec 13 13:32:40.434591 containerd[1480]: time="2024-12-13T13:32:40.434478204Z" level=info msg="StopPodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" returns successfully" Dec 13 13:32:40.438696 containerd[1480]: time="2024-12-13T13:32:40.438625122Z" level=info msg="StopPodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" Dec 13 13:32:40.440375 containerd[1480]: time="2024-12-13T13:32:40.439641481Z" level=info msg="TearDown network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" successfully" Dec 13 13:32:40.440375 containerd[1480]: time="2024-12-13T13:32:40.439665322Z" level=info msg="StopPodSandbox 
for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" returns successfully" Dec 13 13:32:40.440375 containerd[1480]: time="2024-12-13T13:32:40.440009495Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" Dec 13 13:32:40.440375 containerd[1480]: time="2024-12-13T13:32:40.440086538Z" level=info msg="TearDown network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" successfully" Dec 13 13:32:40.440375 containerd[1480]: time="2024-12-13T13:32:40.440096218Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" returns successfully" Dec 13 13:32:40.441531 containerd[1480]: time="2024-12-13T13:32:40.441396347Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:40.442140 containerd[1480]: time="2024-12-13T13:32:40.442065013Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully" Dec 13 13:32:40.443478 containerd[1480]: time="2024-12-13T13:32:40.443450425Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully" Dec 13 13:32:40.445398 containerd[1480]: time="2024-12-13T13:32:40.445343297Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:40.446044 containerd[1480]: time="2024-12-13T13:32:40.446018523Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:40.446302 containerd[1480]: time="2024-12-13T13:32:40.446141568Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:40.448734 containerd[1480]: time="2024-12-13T13:32:40.447888274Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:6,}" Dec 13 13:32:40.451100 kubelet[2825]: I1213 13:32:40.451037 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b" Dec 13 13:32:40.456030 containerd[1480]: time="2024-12-13T13:32:40.455276995Z" level=info msg="StopPodSandbox for \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\"" Dec 13 13:32:40.456030 containerd[1480]: time="2024-12-13T13:32:40.455460642Z" level=info msg="Ensure that sandbox 8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b in task-service has been cleanup successfully" Dec 13 13:32:40.456533 containerd[1480]: time="2024-12-13T13:32:40.456267712Z" level=info msg="TearDown network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\" successfully" Dec 13 13:32:40.456620 containerd[1480]: time="2024-12-13T13:32:40.456603685Z" level=info msg="StopPodSandbox for \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\" returns successfully" Dec 13 13:32:40.459630 containerd[1480]: time="2024-12-13T13:32:40.459577558Z" level=info msg="StopPodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\"" Dec 13 13:32:40.460503 containerd[1480]: time="2024-12-13T13:32:40.460365428Z" level=info msg="TearDown network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" successfully" Dec 13 13:32:40.460503 containerd[1480]: time="2024-12-13T13:32:40.460393429Z" level=info msg="StopPodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" returns successfully" Dec 13 13:32:40.462330 containerd[1480]: time="2024-12-13T13:32:40.462207818Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\"" Dec 13 13:32:40.462330 
containerd[1480]: time="2024-12-13T13:32:40.462328303Z" level=info msg="TearDown network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" successfully" Dec 13 13:32:40.462330 containerd[1480]: time="2024-12-13T13:32:40.462338423Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" returns successfully" Dec 13 13:32:40.465910 containerd[1480]: time="2024-12-13T13:32:40.465857637Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" Dec 13 13:32:40.466141 containerd[1480]: time="2024-12-13T13:32:40.466036124Z" level=info msg="TearDown network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" successfully" Dec 13 13:32:40.466141 containerd[1480]: time="2024-12-13T13:32:40.466048364Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" returns successfully" Dec 13 13:32:40.467281 containerd[1480]: time="2024-12-13T13:32:40.467185567Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:40.467624 containerd[1480]: time="2024-12-13T13:32:40.467295171Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully" Dec 13 13:32:40.467624 containerd[1480]: time="2024-12-13T13:32:40.467305492Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully" Dec 13 13:32:40.470529 containerd[1480]: time="2024-12-13T13:32:40.469675742Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:40.470529 containerd[1480]: time="2024-12-13T13:32:40.469822308Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:40.470529 
containerd[1480]: time="2024-12-13T13:32:40.469833908Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:40.475269 containerd[1480]: time="2024-12-13T13:32:40.474996384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:6,}" Dec 13 13:32:40.530239 kubelet[2825]: I1213 13:32:40.529846 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-rsmpl" podStartSLOduration=1.678337851 podStartE2EDuration="17.529785906s" podCreationTimestamp="2024-12-13 13:32:23 +0000 UTC" firstStartedPulling="2024-12-13 13:32:24.202141956 +0000 UTC m=+24.679449178" lastFinishedPulling="2024-12-13 13:32:40.053589971 +0000 UTC m=+40.530897233" observedRunningTime="2024-12-13 13:32:40.529184283 +0000 UTC m=+41.006491545" watchObservedRunningTime="2024-12-13 13:32:40.529785906 +0000 UTC m=+41.007093168" Dec 13 13:32:40.714829 systemd[1]: run-netns-cni\x2d59e5d09a\x2d147b\x2dc683\x2dd48d\x2d1bbb9646d3f4.mount: Deactivated successfully. Dec 13 13:32:40.714918 systemd[1]: run-netns-cni\x2d8136eecb\x2dda64\x2dbf44\x2df08f\x2deeb05ddbc0db.mount: Deactivated successfully. Dec 13 13:32:40.714988 systemd[1]: run-netns-cni\x2d887ba587\x2da10c\x2d8647\x2d076b\x2dab9d2d8c1bd4.mount: Deactivated successfully. Dec 13 13:32:40.715035 systemd[1]: run-netns-cni\x2d2d680932\x2d36f0\x2d69b0\x2d4a02\x2df2fc7b24b1c8.mount: Deactivated successfully. Dec 13 13:32:40.715081 systemd[1]: run-netns-cni\x2d76b072dd\x2ddcba\x2dca99\x2d7f95\x2d05b6caf39ce4.mount: Deactivated successfully. Dec 13 13:32:40.715122 systemd[1]: run-netns-cni\x2d36942dc1\x2dd434\x2db8ae\x2de468\x2d0c6aac08c81b.mount: Deactivated successfully. 
Dec 13 13:32:41.186207 systemd-networkd[1374]: calid10df53f69e: Link UP Dec 13 13:32:41.186380 systemd-networkd[1374]: calid10df53f69e: Gained carrier Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:40.654 [INFO][4826] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:40.752 [INFO][4826] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0 calico-apiserver-7db64dc7d4- calico-apiserver 587dd10b-ef35-45f6-8cda-437a5ce24419 720 0 2024-12-13 13:32:22 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7db64dc7d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4186-0-0-4-8ed7fad560 calico-apiserver-7db64dc7d4-49csf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid10df53f69e [] []}} ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:40.752 [INFO][4826] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:40.996 [INFO][4874] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" 
HandleID="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.039 [INFO][4874] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" HandleID="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c670), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4186-0-0-4-8ed7fad560", "pod":"calico-apiserver-7db64dc7d4-49csf", "timestamp":"2024-12-13 13:32:40.99596406 +0000 UTC"}, Hostname:"ci-4186-0-0-4-8ed7fad560", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.040 [INFO][4874] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.040 [INFO][4874] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.040 [INFO][4874] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-0-0-4-8ed7fad560' Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.056 [INFO][4874] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.068 [INFO][4874] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.095 [INFO][4874] ipam/ipam.go 521: Ran out of existing affine blocks for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.102 [INFO][4874] ipam/ipam.go 538: Tried all affine blocks. Looking for an affine block with space, or a new unclaimed block host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.105 [INFO][4874] ipam/ipam_block_reader_writer.go 154: Found free block: 192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.105 [INFO][4874] ipam/ipam.go 550: Found unclaimed block host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.105 [INFO][4874] ipam/ipam_block_reader_writer.go 171: Trying to create affinity in pending state host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.109 [INFO][4874] ipam/ipam_block_reader_writer.go 201: Successfully created pending affinity for block host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.109 [INFO][4874] ipam/ipam.go 155: Attempting to load block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.112 [INFO][4874] 
ipam/ipam.go 160: The referenced block doesn't exist, trying to create it cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.119 [INFO][4874] ipam/ipam.go 167: Wrote affinity as pending cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.123 [INFO][4874] ipam/ipam.go 176: Attempting to claim the block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.123 [INFO][4874] ipam/ipam_block_reader_writer.go 223: Attempting to create a new block host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.132 [INFO][4874] ipam/ipam_block_reader_writer.go 264: Successfully created block Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.133 [INFO][4874] ipam/ipam_block_reader_writer.go 275: Confirming affinity host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.142 [INFO][4874] ipam/ipam_block_reader_writer.go 290: Successfully confirmed affinity host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.142 [INFO][4874] ipam/ipam.go 585: Block '192.168.71.192/26' has 64 free ips which is more than 1 ips required. 
host="ci-4186-0-0-4-8ed7fad560" subnet=192.168.71.192/26 Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.142 [INFO][4874] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.71.192/26 handle="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.211439 containerd[1480]: 2024-12-13 13:32:41.146 [INFO][4874] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260 Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.153 [INFO][4874] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.71.192/26 handle="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.166 [INFO][4874] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.71.192/26] block=192.168.71.192/26 handle="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.166 [INFO][4874] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.71.192/26] handle="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.166 [INFO][4874] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.166 [INFO][4874] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.192/26] IPv6=[] ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" HandleID="k8s-pod-network.b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.171 [INFO][4826] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0", GenerateName:"calico-apiserver-7db64dc7d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"587dd10b-ef35-45f6-8cda-437a5ce24419", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db64dc7d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"", Pod:"calico-apiserver-7db64dc7d4-49csf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.71.192/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid10df53f69e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.172 [INFO][4826] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.71.192/32] ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.172 [INFO][4826] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid10df53f69e ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.185 [INFO][4826] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.213182 containerd[1480]: 2024-12-13 13:32:41.186 [INFO][4826] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0", GenerateName:"calico-apiserver-7db64dc7d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"587dd10b-ef35-45f6-8cda-437a5ce24419", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db64dc7d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260", Pod:"calico-apiserver-7db64dc7d4-49csf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.192/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid10df53f69e", MAC:"8e:90:ba:c7:5a:40", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.213446 containerd[1480]: 2024-12-13 13:32:41.208 [INFO][4826] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-49csf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--49csf-eth0" Dec 13 13:32:41.251891 systemd-networkd[1374]: cali130ddcaa285: Link UP Dec 13 13:32:41.252148 systemd-networkd[1374]: cali130ddcaa285: Gained carrier Dec 
13 13:32:41.263210 containerd[1480]: time="2024-12-13T13:32:41.261839792Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:41.263210 containerd[1480]: time="2024-12-13T13:32:41.261920595Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:41.263210 containerd[1480]: time="2024-12-13T13:32:41.262330451Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.263210 containerd[1480]: time="2024-12-13T13:32:41.262469216Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:40.634 [INFO][4782] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:40.762 [INFO][4782] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0 csi-node-driver- calico-system a99d7c4c-ae69-4d70-a627-2ff0fceee5d5 632 0 2024-12-13 13:32:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b695c467 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4186-0-0-4-8ed7fad560 csi-node-driver-gvpdf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali130ddcaa285 [] []}} ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-" Dec 13 
13:32:41.288641 containerd[1480]: 2024-12-13 13:32:40.762 [INFO][4782] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.000 [INFO][4883] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" HandleID="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Workload="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.055 [INFO][4883] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" HandleID="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Workload="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003f1780), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4186-0-0-4-8ed7fad560", "pod":"csi-node-driver-gvpdf", "timestamp":"2024-12-13 13:32:41.000678199 +0000 UTC"}, Hostname:"ci-4186-0-0-4-8ed7fad560", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.055 [INFO][4883] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.167 [INFO][4883] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.168 [INFO][4883] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-0-0-4-8ed7fad560' Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.174 [INFO][4883] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.180 [INFO][4883] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.192 [INFO][4883] ipam/ipam.go 489: Trying affinity for 192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.196 [INFO][4883] ipam/ipam.go 155: Attempting to load block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.207 [INFO][4883] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.207 [INFO][4883] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.71.192/26 handle="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.214 [INFO][4883] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427 Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.224 [INFO][4883] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.71.192/26 handle="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.236 [INFO][4883] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.71.193/26] block=192.168.71.192/26 handle="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.236 [INFO][4883] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.71.193/26] handle="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.236 [INFO][4883] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Dec 13 13:32:41.288641 containerd[1480]: 2024-12-13 13:32:41.236 [INFO][4883] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.193/26] IPv6=[] ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" HandleID="k8s-pod-network.d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Workload="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.290444 containerd[1480]: 2024-12-13 13:32:41.245 [INFO][4782] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5", ResourceVersion:"632", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"", Pod:"csi-node-driver-gvpdf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali130ddcaa285", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.290444 containerd[1480]: 2024-12-13 13:32:41.246 [INFO][4782] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.71.193/32] ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.290444 containerd[1480]: 2024-12-13 13:32:41.246 [INFO][4782] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali130ddcaa285 ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.290444 containerd[1480]: 2024-12-13 13:32:41.251 [INFO][4782] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.290444 containerd[1480]: 2024-12-13 13:32:41.251 
[INFO][4782] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a99d7c4c-ae69-4d70-a627-2ff0fceee5d5", ResourceVersion:"632", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427", Pod:"csi-node-driver-gvpdf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali130ddcaa285", MAC:"6a:2e:c8:87:bc:cc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.290444 containerd[1480]: 2024-12-13 13:32:41.283 [INFO][4782] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427" Namespace="calico-system" Pod="csi-node-driver-gvpdf" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-csi--node--driver--gvpdf-eth0" Dec 13 13:32:41.324439 systemd[1]: Started cri-containerd-b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260.scope - libcontainer container b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260. Dec 13 13:32:41.369054 systemd-networkd[1374]: caliebc12aa71a1: Link UP Dec 13 13:32:41.379644 containerd[1480]: time="2024-12-13T13:32:41.379217105Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:41.379644 containerd[1480]: time="2024-12-13T13:32:41.379284468Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:41.379644 containerd[1480]: time="2024-12-13T13:32:41.379300429Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.379644 containerd[1480]: time="2024-12-13T13:32:41.379387992Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.379471 systemd-networkd[1374]: caliebc12aa71a1: Gained carrier Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:40.820 [INFO][4850] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:40.847 [INFO][4850] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0 coredns-76f75df574- kube-system 8d7d06e0-d386-4db3-9635-acc914ab1f58 722 0 2024-12-13 13:32:15 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4186-0-0-4-8ed7fad560 coredns-76f75df574-xdxpd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliebc12aa71a1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:40.847 [INFO][4850] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.014 [INFO][4892] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" HandleID="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Workload="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 
13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.054 [INFO][4892] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" HandleID="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Workload="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003af520), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4186-0-0-4-8ed7fad560", "pod":"coredns-76f75df574-xdxpd", "timestamp":"2024-12-13 13:32:41.014274357 +0000 UTC"}, Hostname:"ci-4186-0-0-4-8ed7fad560", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.055 [INFO][4892] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.236 [INFO][4892] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.236 [INFO][4892] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-0-0-4-8ed7fad560' Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.242 [INFO][4892] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.260 [INFO][4892] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.292 [INFO][4892] ipam/ipam.go 489: Trying affinity for 192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.300 [INFO][4892] ipam/ipam.go 155: Attempting to load block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.315 [INFO][4892] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.315 [INFO][4892] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.71.192/26 handle="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.328 [INFO][4892] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9 Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.337 [INFO][4892] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.71.192/26 handle="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.350 [INFO][4892] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.71.195/26] block=192.168.71.192/26 handle="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.350 [INFO][4892] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.71.195/26] handle="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.350 [INFO][4892] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Dec 13 13:32:41.413579 containerd[1480]: 2024-12-13 13:32:41.350 [INFO][4892] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.195/26] IPv6=[] ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" HandleID="k8s-pod-network.148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Workload="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 13:32:41.415191 containerd[1480]: 2024-12-13 13:32:41.362 [INFO][4850] cni-plugin/k8s.go 386: Populated endpoint ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"8d7d06e0-d386-4db3-9635-acc914ab1f58", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"", Pod:"coredns-76f75df574-xdxpd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliebc12aa71a1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.415191 containerd[1480]: 2024-12-13 13:32:41.362 [INFO][4850] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.71.195/32] ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 13:32:41.415191 containerd[1480]: 2024-12-13 13:32:41.362 [INFO][4850] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliebc12aa71a1 ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 13:32:41.415191 containerd[1480]: 2024-12-13 13:32:41.386 [INFO][4850] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 13:32:41.415191 containerd[1480]: 2024-12-13 13:32:41.387 [INFO][4850] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"8d7d06e0-d386-4db3-9635-acc914ab1f58", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9", Pod:"coredns-76f75df574-xdxpd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliebc12aa71a1", MAC:"e2:58:b7:cc:3c:d5", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.415191 containerd[1480]: 2024-12-13 13:32:41.408 [INFO][4850] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9" Namespace="kube-system" Pod="coredns-76f75df574-xdxpd" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--xdxpd-eth0" Dec 13 13:32:41.442171 systemd[1]: Started cri-containerd-d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427.scope - libcontainer container d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427. 
Dec 13 13:32:41.488831 systemd-networkd[1374]: calib07af2374c9: Link UP Dec 13 13:32:41.489170 systemd-networkd[1374]: calib07af2374c9: Gained carrier Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:40.581 [INFO][4796] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:40.668 [INFO][4796] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0 coredns-76f75df574- kube-system 93ef4867-9f5c-40e8-b3d6-6a06506fddf9 715 0 2024-12-13 13:32:15 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4186-0-0-4-8ed7fad560 coredns-76f75df574-hgwlz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib07af2374c9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:40.668 [INFO][4796] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.010 [INFO][4867] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" HandleID="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Workload="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 
13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.056 [INFO][4867] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" HandleID="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Workload="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cf4c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4186-0-0-4-8ed7fad560", "pod":"coredns-76f75df574-hgwlz", "timestamp":"2024-12-13 13:32:41.009854629 +0000 UTC"}, Hostname:"ci-4186-0-0-4-8ed7fad560", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.057 [INFO][4867] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.351 [INFO][4867] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.351 [INFO][4867] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-0-0-4-8ed7fad560' Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.355 [INFO][4867] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.370 [INFO][4867] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.404 [INFO][4867] ipam/ipam.go 489: Trying affinity for 192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.413 [INFO][4867] ipam/ipam.go 155: Attempting to load block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.421 [INFO][4867] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.422 [INFO][4867] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.71.192/26 handle="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.426 [INFO][4867] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51 Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.435 [INFO][4867] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.71.192/26 handle="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.451 [INFO][4867] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.71.196/26] block=192.168.71.192/26 handle="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.451 [INFO][4867] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.71.196/26] handle="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.451 [INFO][4867] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Dec 13 13:32:41.536209 containerd[1480]: 2024-12-13 13:32:41.451 [INFO][4867] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.196/26] IPv6=[] ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" HandleID="k8s-pod-network.02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Workload="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 13:32:41.537534 containerd[1480]: 2024-12-13 13:32:41.457 [INFO][4796] cni-plugin/k8s.go 386: Populated endpoint ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"93ef4867-9f5c-40e8-b3d6-6a06506fddf9", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"", Pod:"coredns-76f75df574-hgwlz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib07af2374c9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.537534 containerd[1480]: 2024-12-13 13:32:41.458 [INFO][4796] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.71.196/32] ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 13:32:41.537534 containerd[1480]: 2024-12-13 13:32:41.458 [INFO][4796] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib07af2374c9 ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 13:32:41.537534 containerd[1480]: 2024-12-13 13:32:41.488 [INFO][4796] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 13:32:41.537534 containerd[1480]: 2024-12-13 13:32:41.494 [INFO][4796] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"93ef4867-9f5c-40e8-b3d6-6a06506fddf9", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51", Pod:"coredns-76f75df574-hgwlz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib07af2374c9", MAC:"8e:ee:2f:ab:23:a3", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.537534 containerd[1480]: 2024-12-13 13:32:41.528 [INFO][4796] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51" Namespace="kube-system" Pod="coredns-76f75df574-hgwlz" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-coredns--76f75df574--hgwlz-eth0" Dec 13 13:32:41.539049 containerd[1480]: time="2024-12-13T13:32:41.536351934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-49csf,Uid:587dd10b-ef35-45f6-8cda-437a5ce24419,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260\"" Dec 13 13:32:41.543163 containerd[1480]: time="2024-12-13T13:32:41.542091073Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:41.543163 containerd[1480]: time="2024-12-13T13:32:41.542156675Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:41.543163 containerd[1480]: time="2024-12-13T13:32:41.542171956Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.543163 containerd[1480]: time="2024-12-13T13:32:41.542256719Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.552136 containerd[1480]: time="2024-12-13T13:32:41.552093214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Dec 13 13:32:41.589874 systemd[1]: Started cri-containerd-148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9.scope - libcontainer container 148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9. Dec 13 13:32:41.612317 systemd-networkd[1374]: calib8b2985f2e4: Link UP Dec 13 13:32:41.613636 systemd-networkd[1374]: calib8b2985f2e4: Gained carrier Dec 13 13:32:41.624121 containerd[1480]: time="2024-12-13T13:32:41.623989754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gvpdf,Uid:a99d7c4c-ae69-4d70-a627-2ff0fceee5d5,Namespace:calico-system,Attempt:7,} returns sandbox id \"d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427\"" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:40.769 [INFO][4811] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:40.847 [INFO][4811] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0 calico-apiserver-7db64dc7d4- calico-apiserver 74a12de7-657f-4aae-821f-e260248f542a 723 0 2024-12-13 13:32:22 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7db64dc7d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4186-0-0-4-8ed7fad560 calico-apiserver-7db64dc7d4-8w5p4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib8b2985f2e4 [] []}} ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" 
Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:40.847 [INFO][4811] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.036 [INFO][4891] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" HandleID="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.064 [INFO][4891] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" HandleID="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000325f70), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4186-0-0-4-8ed7fad560", "pod":"calico-apiserver-7db64dc7d4-8w5p4", "timestamp":"2024-12-13 13:32:41.036448602 +0000 UTC"}, Hostname:"ci-4186-0-0-4-8ed7fad560", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.064 [INFO][4891] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.452 [INFO][4891] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.452 [INFO][4891] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-0-0-4-8ed7fad560' Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.457 [INFO][4891] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.469 [INFO][4891] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.495 [INFO][4891] ipam/ipam.go 489: Trying affinity for 192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.508 [INFO][4891] ipam/ipam.go 155: Attempting to load block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.544 [INFO][4891] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.545 [INFO][4891] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.71.192/26 handle="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.552 [INFO][4891] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.563 [INFO][4891] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.71.192/26 handle="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" 
host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.578 [INFO][4891] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.71.197/26] block=192.168.71.192/26 handle="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.578 [INFO][4891] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.71.197/26] handle="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.579 [INFO][4891] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Dec 13 13:32:41.683096 containerd[1480]: 2024-12-13 13:32:41.579 [INFO][4891] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.197/26] IPv6=[] ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" HandleID="k8s-pod-network.4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.684189 containerd[1480]: 2024-12-13 13:32:41.603 [INFO][4811] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0", GenerateName:"calico-apiserver-7db64dc7d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"74a12de7-657f-4aae-821f-e260248f542a", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 22, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db64dc7d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"", Pod:"calico-apiserver-7db64dc7d4-8w5p4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib8b2985f2e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.684189 containerd[1480]: 2024-12-13 13:32:41.603 [INFO][4811] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.71.197/32] ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.684189 containerd[1480]: 2024-12-13 13:32:41.603 [INFO][4811] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib8b2985f2e4 ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.684189 containerd[1480]: 2024-12-13 13:32:41.615 [INFO][4811] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.684189 containerd[1480]: 2024-12-13 13:32:41.631 [INFO][4811] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0", GenerateName:"calico-apiserver-7db64dc7d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"74a12de7-657f-4aae-821f-e260248f542a", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7db64dc7d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f", Pod:"calico-apiserver-7db64dc7d4-8w5p4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib8b2985f2e4", MAC:"22:0e:b1:0f:cb:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.684189 containerd[1480]: 2024-12-13 13:32:41.672 [INFO][4811] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f" Namespace="calico-apiserver" Pod="calico-apiserver-7db64dc7d4-8w5p4" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--apiserver--7db64dc7d4--8w5p4-eth0" Dec 13 13:32:41.688512 containerd[1480]: time="2024-12-13T13:32:41.685937515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-xdxpd,Uid:8d7d06e0-d386-4db3-9635-acc914ab1f58,Namespace:kube-system,Attempt:6,} returns sandbox id \"148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9\"" Dec 13 13:32:41.694738 containerd[1480]: time="2024-12-13T13:32:41.692887260Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:41.694738 containerd[1480]: time="2024-12-13T13:32:41.693182511Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:41.694738 containerd[1480]: time="2024-12-13T13:32:41.693199472Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.694738 containerd[1480]: time="2024-12-13T13:32:41.693623848Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.714444 containerd[1480]: time="2024-12-13T13:32:41.714181111Z" level=info msg="CreateContainer within sandbox \"148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 13 13:32:41.744298 systemd[1]: Started cri-containerd-02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51.scope - libcontainer container 02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51. Dec 13 13:32:41.752992 systemd-networkd[1374]: cali9b79365c4e4: Link UP Dec 13 13:32:41.755577 systemd-networkd[1374]: cali9b79365c4e4: Gained carrier Dec 13 13:32:41.775782 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3320044128.mount: Deactivated successfully. Dec 13 13:32:41.796626 containerd[1480]: time="2024-12-13T13:32:41.796480008Z" level=info msg="CreateContainer within sandbox \"148e2b747d97820662ba0c732daec81f53035f435dbdc94d4b7c595b11cab8b9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ce1754196bdb6e34d6ee47262e3f9fb56b24fb9e1a378e61d154ca8fdbc3e379\"" Dec 13 13:32:41.798834 containerd[1480]: time="2024-12-13T13:32:41.798299197Z" level=info msg="StartContainer for \"ce1754196bdb6e34d6ee47262e3f9fb56b24fb9e1a378e61d154ca8fdbc3e379\"" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:40.784 [INFO][4839] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:40.853 [INFO][4839] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0 calico-kube-controllers-568896bf68- calico-system bcc560db-f238-49e4-9766-c00316d8e479 721 0 2024-12-13 13:32:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:568896bf68 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4186-0-0-4-8ed7fad560 calico-kube-controllers-568896bf68-gs24f eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9b79365c4e4 [] []}} ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:40.854 [INFO][4839] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.037 [INFO][4893] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" HandleID="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.066 [INFO][4893] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" HandleID="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400039b200), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4186-0-0-4-8ed7fad560", 
"pod":"calico-kube-controllers-568896bf68-gs24f", "timestamp":"2024-12-13 13:32:41.037723771 +0000 UTC"}, Hostname:"ci-4186-0-0-4-8ed7fad560", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.066 [INFO][4893] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.579 [INFO][4893] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.583 [INFO][4893] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4186-0-0-4-8ed7fad560' Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.587 [INFO][4893] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.609 [INFO][4893] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.631 [INFO][4893] ipam/ipam.go 489: Trying affinity for 192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.645 [INFO][4893] ipam/ipam.go 155: Attempting to load block cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.658 [INFO][4893] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.71.192/26 host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.658 [INFO][4893] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.71.192/26 
handle="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.665 [INFO][4893] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95 Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.687 [INFO][4893] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.71.192/26 handle="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.719 [INFO][4893] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.71.198/26] block=192.168.71.192/26 handle="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.719 [INFO][4893] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.71.198/26] handle="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" host="ci-4186-0-0-4-8ed7fad560" Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.719 [INFO][4893] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Dec 13 13:32:41.802970 containerd[1480]: 2024-12-13 13:32:41.719 [INFO][4893] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.198/26] IPv6=[] ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" HandleID="k8s-pod-network.f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Workload="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 13:32:41.806406 containerd[1480]: 2024-12-13 13:32:41.740 [INFO][4839] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0", GenerateName:"calico-kube-controllers-568896bf68-", Namespace:"calico-system", SelfLink:"", UID:"bcc560db-f238-49e4-9766-c00316d8e479", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"568896bf68", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"", Pod:"calico-kube-controllers-568896bf68-gs24f", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9b79365c4e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.806406 containerd[1480]: 2024-12-13 13:32:41.740 [INFO][4839] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.71.198/32] ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 13:32:41.806406 containerd[1480]: 2024-12-13 13:32:41.740 [INFO][4839] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9b79365c4e4 ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 13:32:41.806406 containerd[1480]: 2024-12-13 13:32:41.757 [INFO][4839] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 13:32:41.806406 containerd[1480]: 2024-12-13 13:32:41.758 [INFO][4839] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0", GenerateName:"calico-kube-controllers-568896bf68-", Namespace:"calico-system", SelfLink:"", UID:"bcc560db-f238-49e4-9766-c00316d8e479", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2024, time.December, 13, 13, 32, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"568896bf68", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4186-0-0-4-8ed7fad560", ContainerID:"f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95", Pod:"calico-kube-controllers-568896bf68-gs24f", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9b79365c4e4", MAC:"7a:35:86:41:a4:57", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Dec 13 13:32:41.806406 containerd[1480]: 2024-12-13 13:32:41.797 [INFO][4839] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95" Namespace="calico-system" Pod="calico-kube-controllers-568896bf68-gs24f" WorkloadEndpoint="ci--4186--0--0--4--8ed7fad560-k8s-calico--kube--controllers--568896bf68--gs24f-eth0" Dec 13 
13:32:41.826282 containerd[1480]: time="2024-12-13T13:32:41.825786245Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:41.827268 containerd[1480]: time="2024-12-13T13:32:41.827145136Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:41.828556 containerd[1480]: time="2024-12-13T13:32:41.827207819Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.830665 containerd[1480]: time="2024-12-13T13:32:41.829665112Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.864818 containerd[1480]: time="2024-12-13T13:32:41.864766250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-hgwlz,Uid:93ef4867-9f5c-40e8-b3d6-6a06506fddf9,Namespace:kube-system,Attempt:6,} returns sandbox id \"02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51\"" Dec 13 13:32:41.876267 containerd[1480]: time="2024-12-13T13:32:41.876217367Z" level=info msg="CreateContainer within sandbox \"02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 13 13:32:41.879827 systemd[1]: Started cri-containerd-4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f.scope - libcontainer container 4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f. Dec 13 13:32:41.889469 containerd[1480]: time="2024-12-13T13:32:41.888729883Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Dec 13 13:32:41.889469 containerd[1480]: time="2024-12-13T13:32:41.888799446Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Dec 13 13:32:41.889469 containerd[1480]: time="2024-12-13T13:32:41.888814727Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.889469 containerd[1480]: time="2024-12-13T13:32:41.888900850Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Dec 13 13:32:41.892834 systemd[1]: Started cri-containerd-ce1754196bdb6e34d6ee47262e3f9fb56b24fb9e1a378e61d154ca8fdbc3e379.scope - libcontainer container ce1754196bdb6e34d6ee47262e3f9fb56b24fb9e1a378e61d154ca8fdbc3e379. Dec 13 13:32:41.908841 containerd[1480]: time="2024-12-13T13:32:41.908665883Z" level=info msg="CreateContainer within sandbox \"02dd29c687217ba3dbb0557056f6fcf61688e5229989a1dcd7b1e57a1d85ac51\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6dd687fe7b78d1074c049f8679de41ab1221f80078bc4720b0463ab47e8c5d0e\"" Dec 13 13:32:41.914536 containerd[1480]: time="2024-12-13T13:32:41.913484067Z" level=info msg="StartContainer for \"6dd687fe7b78d1074c049f8679de41ab1221f80078bc4720b0463ab47e8c5d0e\"" Dec 13 13:32:41.938797 systemd[1]: Started cri-containerd-f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95.scope - libcontainer container f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95. Dec 13 13:32:41.978865 containerd[1480]: time="2024-12-13T13:32:41.978400581Z" level=info msg="StartContainer for \"ce1754196bdb6e34d6ee47262e3f9fb56b24fb9e1a378e61d154ca8fdbc3e379\" returns successfully" Dec 13 13:32:42.008917 systemd[1]: Started cri-containerd-6dd687fe7b78d1074c049f8679de41ab1221f80078bc4720b0463ab47e8c5d0e.scope - libcontainer container 6dd687fe7b78d1074c049f8679de41ab1221f80078bc4720b0463ab47e8c5d0e. 
Dec 13 13:32:42.021105 containerd[1480]: time="2024-12-13T13:32:42.021037208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7db64dc7d4-8w5p4,Uid:74a12de7-657f-4aae-821f-e260248f542a,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f\"" Dec 13 13:32:42.119414 containerd[1480]: time="2024-12-13T13:32:42.118019315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568896bf68-gs24f,Uid:bcc560db-f238-49e4-9766-c00316d8e479,Namespace:calico-system,Attempt:6,} returns sandbox id \"f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95\"" Dec 13 13:32:42.124879 containerd[1480]: time="2024-12-13T13:32:42.124407159Z" level=info msg="StartContainer for \"6dd687fe7b78d1074c049f8679de41ab1221f80078bc4720b0463ab47e8c5d0e\" returns successfully" Dec 13 13:32:42.517364 kubelet[2825]: I1213 13:32:42.517211 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-xdxpd" podStartSLOduration=27.516885559 podStartE2EDuration="27.516885559s" podCreationTimestamp="2024-12-13 13:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-12-13 13:32:42.515929963 +0000 UTC m=+42.993237225" watchObservedRunningTime="2024-12-13 13:32:42.516885559 +0000 UTC m=+42.994193261" Dec 13 13:32:42.550946 systemd-networkd[1374]: calib07af2374c9: Gained IPv6LL Dec 13 13:32:42.560620 kubelet[2825]: I1213 13:32:42.559672 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-hgwlz" podStartSLOduration=27.559628433 podStartE2EDuration="27.559628433s" podCreationTimestamp="2024-12-13 13:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-12-13 13:32:42.558576913 +0000 UTC 
m=+43.035884175" watchObservedRunningTime="2024-12-13 13:32:42.559628433 +0000 UTC m=+43.036935695" Dec 13 13:32:42.743571 systemd-networkd[1374]: cali130ddcaa285: Gained IPv6LL Dec 13 13:32:42.870773 systemd-networkd[1374]: calib8b2985f2e4: Gained IPv6LL Dec 13 13:32:42.934853 systemd-networkd[1374]: cali9b79365c4e4: Gained IPv6LL Dec 13 13:32:42.998750 systemd-networkd[1374]: calid10df53f69e: Gained IPv6LL Dec 13 13:32:42.999082 systemd-networkd[1374]: caliebc12aa71a1: Gained IPv6LL Dec 13 13:32:43.014526 kernel: bpftool[5450]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Dec 13 13:32:43.252319 systemd-networkd[1374]: vxlan.calico: Link UP Dec 13 13:32:43.252327 systemd-networkd[1374]: vxlan.calico: Gained carrier Dec 13 13:32:44.983820 systemd-networkd[1374]: vxlan.calico: Gained IPv6LL Dec 13 13:32:45.506431 containerd[1480]: time="2024-12-13T13:32:45.506358968Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:45.509023 containerd[1480]: time="2024-12-13T13:32:45.508413687Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=39298409" Dec 13 13:32:45.509023 containerd[1480]: time="2024-12-13T13:32:45.508951147Z" level=info msg="ImageCreate event name:\"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:45.512531 containerd[1480]: time="2024-12-13T13:32:45.512431561Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 13 13:32:45.513713 containerd[1480]: time="2024-12-13T13:32:45.513658249Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id 
\"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 3.960485113s" Dec 13 13:32:45.513713 containerd[1480]: time="2024-12-13T13:32:45.513702290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\"" Dec 13 13:32:45.514609 containerd[1480]: time="2024-12-13T13:32:45.514283313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Dec 13 13:32:45.517706 containerd[1480]: time="2024-12-13T13:32:45.517662483Z" level=info msg="CreateContainer within sandbox \"b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Dec 13 13:32:45.558890 containerd[1480]: time="2024-12-13T13:32:45.558830429Z" level=info msg="CreateContainer within sandbox \"b7198ff0f15252dea329209e6a222fb82b5459495e73b53c7fa6d695b357f260\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8cafeaf6959fe775ad7217bc981cdb7d916ecbae6423a50ff06d04d2e79cac9e\"" Dec 13 13:32:45.561200 containerd[1480]: time="2024-12-13T13:32:45.559717263Z" level=info msg="StartContainer for \"8cafeaf6959fe775ad7217bc981cdb7d916ecbae6423a50ff06d04d2e79cac9e\"" Dec 13 13:32:45.594249 systemd[1]: run-containerd-runc-k8s.io-8cafeaf6959fe775ad7217bc981cdb7d916ecbae6423a50ff06d04d2e79cac9e-runc.Jysr50.mount: Deactivated successfully. Dec 13 13:32:45.605840 systemd[1]: Started cri-containerd-8cafeaf6959fe775ad7217bc981cdb7d916ecbae6423a50ff06d04d2e79cac9e.scope - libcontainer container 8cafeaf6959fe775ad7217bc981cdb7d916ecbae6423a50ff06d04d2e79cac9e. 
Dec 13 13:32:45.647399 containerd[1480]: time="2024-12-13T13:32:45.647343519Z" level=info msg="StartContainer for \"8cafeaf6959fe775ad7217bc981cdb7d916ecbae6423a50ff06d04d2e79cac9e\" returns successfully"
Dec 13 13:32:46.971968 containerd[1480]: time="2024-12-13T13:32:46.971895325Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:46.975425 containerd[1480]: time="2024-12-13T13:32:46.975353499Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7464730"
Dec 13 13:32:46.977365 containerd[1480]: time="2024-12-13T13:32:46.977310535Z" level=info msg="ImageCreate event name:\"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:46.982335 containerd[1480]: time="2024-12-13T13:32:46.982281127Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"8834384\" in 1.467956012s"
Dec 13 13:32:46.982607 containerd[1480]: time="2024-12-13T13:32:46.982393251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\""
Dec 13 13:32:46.984118 containerd[1480]: time="2024-12-13T13:32:46.983982872Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:46.985424 containerd[1480]: time="2024-12-13T13:32:46.984855226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\""
Dec 13 13:32:46.987576 containerd[1480]: time="2024-12-13T13:32:46.987441966Z" level=info msg="CreateContainer within sandbox \"d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Dec 13 13:32:47.025995 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount585646461.mount: Deactivated successfully.
Dec 13 13:32:47.041735 containerd[1480]: time="2024-12-13T13:32:47.041646943Z" level=info msg="CreateContainer within sandbox \"d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"746915be0df6db5aa687aef12c780173c34989255ba16d98a99f0cab1891d6c7\""
Dec 13 13:32:47.044420 containerd[1480]: time="2024-12-13T13:32:47.043843588Z" level=info msg="StartContainer for \"746915be0df6db5aa687aef12c780173c34989255ba16d98a99f0cab1891d6c7\""
Dec 13 13:32:47.109008 systemd[1]: Started cri-containerd-746915be0df6db5aa687aef12c780173c34989255ba16d98a99f0cab1891d6c7.scope - libcontainer container 746915be0df6db5aa687aef12c780173c34989255ba16d98a99f0cab1891d6c7.
Dec 13 13:32:47.174263 containerd[1480]: time="2024-12-13T13:32:47.174204116Z" level=info msg="StartContainer for \"746915be0df6db5aa687aef12c780173c34989255ba16d98a99f0cab1891d6c7\" returns successfully"
Dec 13 13:32:47.371195 containerd[1480]: time="2024-12-13T13:32:47.370047818Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:47.371195 containerd[1480]: time="2024-12-13T13:32:47.371004735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77"
Dec 13 13:32:47.380153 containerd[1480]: time="2024-12-13T13:32:47.379944801Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 395.039654ms"
Dec 13 13:32:47.380153 containerd[1480]: time="2024-12-13T13:32:47.380008364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\""
Dec 13 13:32:47.382010 containerd[1480]: time="2024-12-13T13:32:47.381921278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\""
Dec 13 13:32:47.383900 containerd[1480]: time="2024-12-13T13:32:47.383808431Z" level=info msg="CreateContainer within sandbox \"4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Dec 13 13:32:47.410709 containerd[1480]: time="2024-12-13T13:32:47.410655790Z" level=info msg="CreateContainer within sandbox \"4d385c4a6116826bd738269aad9074eff0bc58860f6fa5a0d845f30aa573f44f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b66da8a2f29d6a7a46bc5cae43b4c38d48419d3277200ec47d75f61db8bf1140\""
Dec 13 13:32:47.416575 containerd[1480]: time="2024-12-13T13:32:47.416316890Z" level=info msg="StartContainer for \"b66da8a2f29d6a7a46bc5cae43b4c38d48419d3277200ec47d75f61db8bf1140\""
Dec 13 13:32:47.447766 systemd[1]: Started cri-containerd-b66da8a2f29d6a7a46bc5cae43b4c38d48419d3277200ec47d75f61db8bf1140.scope - libcontainer container b66da8a2f29d6a7a46bc5cae43b4c38d48419d3277200ec47d75f61db8bf1140.
Dec 13 13:32:47.507217 containerd[1480]: time="2024-12-13T13:32:47.506284453Z" level=info msg="StartContainer for \"b66da8a2f29d6a7a46bc5cae43b4c38d48419d3277200ec47d75f61db8bf1140\" returns successfully"
Dec 13 13:32:47.579467 kubelet[2825]: I1213 13:32:47.579432 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 13 13:32:47.600289 kubelet[2825]: I1213 13:32:47.600239 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7db64dc7d4-8w5p4" podStartSLOduration=20.276959539 podStartE2EDuration="25.600166248s" podCreationTimestamp="2024-12-13 13:32:22 +0000 UTC" firstStartedPulling="2024-12-13 13:32:42.057173549 +0000 UTC m=+42.534480811" lastFinishedPulling="2024-12-13 13:32:47.380380298 +0000 UTC m=+47.857687520" observedRunningTime="2024-12-13 13:32:47.599636027 +0000 UTC m=+48.076943369" watchObservedRunningTime="2024-12-13 13:32:47.600166248 +0000 UTC m=+48.077473550"
Dec 13 13:32:47.600478 kubelet[2825]: I1213 13:32:47.600425 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7db64dc7d4-49csf" podStartSLOduration=21.633387895 podStartE2EDuration="25.600403097s" podCreationTimestamp="2024-12-13 13:32:22 +0000 UTC" firstStartedPulling="2024-12-13 13:32:41.547023941 +0000 UTC m=+42.024331203" lastFinishedPulling="2024-12-13 13:32:45.514039143 +0000 UTC m=+45.991346405" observedRunningTime="2024-12-13 13:32:46.567051849 +0000 UTC m=+47.044359111" watchObservedRunningTime="2024-12-13 13:32:47.600403097 +0000 UTC m=+48.077710359"
Dec 13 13:32:48.017939 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1816380278.mount: Deactivated successfully.
Dec 13 13:32:48.582157 kubelet[2825]: I1213 13:32:48.582079 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 13 13:32:50.244567 containerd[1480]: time="2024-12-13T13:32:50.244456561Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:50.246036 containerd[1480]: time="2024-12-13T13:32:50.245961459Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=31953828"
Dec 13 13:32:50.248350 containerd[1480]: time="2024-12-13T13:32:50.248285950Z" level=info msg="ImageCreate event name:\"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:50.251517 containerd[1480]: time="2024-12-13T13:32:50.251442313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:50.253269 containerd[1480]: time="2024-12-13T13:32:50.253179981Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"33323450\" in 2.87120142s"
Dec 13 13:32:50.253269 containerd[1480]: time="2024-12-13T13:32:50.253237583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\""
Dec 13 13:32:50.254660 containerd[1480]: time="2024-12-13T13:32:50.254092816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\""
Dec 13 13:32:50.294262 containerd[1480]: time="2024-12-13T13:32:50.293948610Z" level=info msg="CreateContainer within sandbox \"f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Dec 13 13:32:50.317502 containerd[1480]: time="2024-12-13T13:32:50.317449806Z" level=info msg="CreateContainer within sandbox \"f69c1aa7871d629ec922a948b68d83d19e92c6dfe52e53976feb43b4311f0a95\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe\""
Dec 13 13:32:50.319443 containerd[1480]: time="2024-12-13T13:32:50.318035749Z" level=info msg="StartContainer for \"7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe\""
Dec 13 13:32:50.358006 systemd[1]: Started cri-containerd-7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe.scope - libcontainer container 7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe.
Dec 13 13:32:50.414992 containerd[1480]: time="2024-12-13T13:32:50.414900605Z" level=info msg="StartContainer for \"7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe\" returns successfully"
Dec 13 13:32:51.678480 kubelet[2825]: I1213 13:32:51.676987 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-568896bf68-gs24f" podStartSLOduration=19.543615058 podStartE2EDuration="27.676939335s" podCreationTimestamp="2024-12-13 13:32:24 +0000 UTC" firstStartedPulling="2024-12-13 13:32:42.120540531 +0000 UTC m=+42.597847753" lastFinishedPulling="2024-12-13 13:32:50.253864768 +0000 UTC m=+50.731172030" observedRunningTime="2024-12-13 13:32:50.612005848 +0000 UTC m=+51.089313110" watchObservedRunningTime="2024-12-13 13:32:51.676939335 +0000 UTC m=+52.154246557"
Dec 13 13:32:51.881617 containerd[1480]: time="2024-12-13T13:32:51.881545247Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:51.884377 containerd[1480]: time="2024-12-13T13:32:51.884310915Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=9883368"
Dec 13 13:32:51.885788 containerd[1480]: time="2024-12-13T13:32:51.885711450Z" level=info msg="ImageCreate event name:\"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:51.889346 containerd[1480]: time="2024-12-13T13:32:51.888898895Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 13 13:32:51.889894 containerd[1480]: time="2024-12-13T13:32:51.889857852Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11252974\" in 1.635720354s"
Dec 13 13:32:51.889894 containerd[1480]: time="2024-12-13T13:32:51.889893733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\""
Dec 13 13:32:51.893986 containerd[1480]: time="2024-12-13T13:32:51.893925811Z" level=info msg="CreateContainer within sandbox \"d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Dec 13 13:32:51.922753 containerd[1480]: time="2024-12-13T13:32:51.922586850Z" level=info msg="CreateContainer within sandbox \"d9597283e1740f559e3cb2359f19a03fdb31c19dc3fc2fd9a8824a54fd5d1427\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3286d9e7b9e9eae817fd68baca43153f3679373a930c7c1841a78cdd54f8cded\""
Dec 13 13:32:51.924589 containerd[1480]: time="2024-12-13T13:32:51.923445764Z" level=info msg="StartContainer for \"3286d9e7b9e9eae817fd68baca43153f3679373a930c7c1841a78cdd54f8cded\""
Dec 13 13:32:51.964793 systemd[1]: Started cri-containerd-3286d9e7b9e9eae817fd68baca43153f3679373a930c7c1841a78cdd54f8cded.scope - libcontainer container 3286d9e7b9e9eae817fd68baca43153f3679373a930c7c1841a78cdd54f8cded.
Dec 13 13:32:52.013778 containerd[1480]: time="2024-12-13T13:32:52.013700171Z" level=info msg="StartContainer for \"3286d9e7b9e9eae817fd68baca43153f3679373a930c7c1841a78cdd54f8cded\" returns successfully"
Dec 13 13:32:52.634295 kubelet[2825]: I1213 13:32:52.633869 2825 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-gvpdf" podStartSLOduration=19.371063944 podStartE2EDuration="29.633824004s" podCreationTimestamp="2024-12-13 13:32:23 +0000 UTC" firstStartedPulling="2024-12-13 13:32:41.627347082 +0000 UTC m=+42.104654304" lastFinishedPulling="2024-12-13 13:32:51.890107102 +0000 UTC m=+52.367414364" observedRunningTime="2024-12-13 13:32:52.622857814 +0000 UTC m=+53.100165116" watchObservedRunningTime="2024-12-13 13:32:52.633824004 +0000 UTC m=+53.111131266"
Dec 13 13:32:52.837519 kubelet[2825]: I1213 13:32:52.837392 2825 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Dec 13 13:32:52.837930 kubelet[2825]: I1213 13:32:52.837575 2825 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Dec 13 13:32:58.734548 kubelet[2825]: I1213 13:32:58.732575 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 13 13:32:59.684679 containerd[1480]: time="2024-12-13T13:32:59.684631579Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\""
Dec 13 13:32:59.685692 containerd[1480]: time="2024-12-13T13:32:59.684755624Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully"
Dec 13 13:32:59.685692 containerd[1480]: time="2024-12-13T13:32:59.684766785Z" level=info msg="StopPodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully"
Dec 13 13:32:59.686185 containerd[1480]: time="2024-12-13T13:32:59.685948552Z" level=info msg="RemovePodSandbox for \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\""
Dec 13 13:32:59.686185 containerd[1480]: time="2024-12-13T13:32:59.686008954Z" level=info msg="Forcibly stopping sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\""
Dec 13 13:32:59.686185 containerd[1480]: time="2024-12-13T13:32:59.686084797Z" level=info msg="TearDown network for sandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" successfully"
Dec 13 13:32:59.695517 containerd[1480]: time="2024-12-13T13:32:59.693656457Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.695517 containerd[1480]: time="2024-12-13T13:32:59.693911707Z" level=info msg="RemovePodSandbox \"c14b367666dadf56f43036012b1f44e725e0675c18b17a17a52db521d587cf1b\" returns successfully"
Dec 13 13:32:59.699371 containerd[1480]: time="2024-12-13T13:32:59.699312081Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\""
Dec 13 13:32:59.699580 containerd[1480]: time="2024-12-13T13:32:59.699560571Z" level=info msg="TearDown network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully"
Dec 13 13:32:59.699613 containerd[1480]: time="2024-12-13T13:32:59.699578852Z" level=info msg="StopPodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully"
Dec 13 13:32:59.700339 containerd[1480]: time="2024-12-13T13:32:59.700299360Z" level=info msg="RemovePodSandbox for \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\""
Dec 13 13:32:59.700339 containerd[1480]: time="2024-12-13T13:32:59.700341482Z" level=info msg="Forcibly stopping sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\""
Dec 13 13:32:59.700542 containerd[1480]: time="2024-12-13T13:32:59.700520529Z" level=info msg="TearDown network for sandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" successfully"
Dec 13 13:32:59.709818 containerd[1480]: time="2024-12-13T13:32:59.709753575Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.710080 containerd[1480]: time="2024-12-13T13:32:59.710048627Z" level=info msg="RemovePodSandbox \"28ab22feaee7a90698140dd60f042eea82c7d4b839a2132dd54bbe4f52681246\" returns successfully"
Dec 13 13:32:59.710875 containerd[1480]: time="2024-12-13T13:32:59.710832298Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\""
Dec 13 13:32:59.710998 containerd[1480]: time="2024-12-13T13:32:59.710982264Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully"
Dec 13 13:32:59.711037 containerd[1480]: time="2024-12-13T13:32:59.710997704Z" level=info msg="StopPodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully"
Dec 13 13:32:59.711607 containerd[1480]: time="2024-12-13T13:32:59.711492324Z" level=info msg="RemovePodSandbox for \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\""
Dec 13 13:32:59.711698 containerd[1480]: time="2024-12-13T13:32:59.711615009Z" level=info msg="Forcibly stopping sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\""
Dec 13 13:32:59.711724 containerd[1480]: time="2024-12-13T13:32:59.711700852Z" level=info msg="TearDown network for sandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" successfully"
Dec 13 13:32:59.716143 containerd[1480]: time="2024-12-13T13:32:59.715907339Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.716143 containerd[1480]: time="2024-12-13T13:32:59.716075906Z" level=info msg="RemovePodSandbox \"011025220a34b5618daf9e946f064224b85e4164bb0a54bde2a65f320fa54464\" returns successfully"
Dec 13 13:32:59.717425 containerd[1480]: time="2024-12-13T13:32:59.717154869Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\""
Dec 13 13:32:59.717425 containerd[1480]: time="2024-12-13T13:32:59.717328595Z" level=info msg="TearDown network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" successfully"
Dec 13 13:32:59.717425 containerd[1480]: time="2024-12-13T13:32:59.717346996Z" level=info msg="StopPodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" returns successfully"
Dec 13 13:32:59.719096 containerd[1480]: time="2024-12-13T13:32:59.718828015Z" level=info msg="RemovePodSandbox for \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\""
Dec 13 13:32:59.719096 containerd[1480]: time="2024-12-13T13:32:59.718888017Z" level=info msg="Forcibly stopping sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\""
Dec 13 13:32:59.719647 containerd[1480]: time="2024-12-13T13:32:59.718999102Z" level=info msg="TearDown network for sandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" successfully"
Dec 13 13:32:59.726963 containerd[1480]: time="2024-12-13T13:32:59.726903135Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.727321 containerd[1480]: time="2024-12-13T13:32:59.727240108Z" level=info msg="RemovePodSandbox \"1aa3c22d091cc7fba54facf0d92eb782ecd6f62b20ff4556771ba96fd276d7ca\" returns successfully"
Dec 13 13:32:59.729169 containerd[1480]: time="2024-12-13T13:32:59.728695686Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\""
Dec 13 13:32:59.729169 containerd[1480]: time="2024-12-13T13:32:59.728863693Z" level=info msg="TearDown network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" successfully"
Dec 13 13:32:59.729169 containerd[1480]: time="2024-12-13T13:32:59.728880373Z" level=info msg="StopPodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" returns successfully"
Dec 13 13:32:59.729367 containerd[1480]: time="2024-12-13T13:32:59.729337191Z" level=info msg="RemovePodSandbox for \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\""
Dec 13 13:32:59.729424 containerd[1480]: time="2024-12-13T13:32:59.729382553Z" level=info msg="Forcibly stopping sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\""
Dec 13 13:32:59.729624 containerd[1480]: time="2024-12-13T13:32:59.729570641Z" level=info msg="TearDown network for sandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" successfully"
Dec 13 13:32:59.734860 containerd[1480]: time="2024-12-13T13:32:59.734603560Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.734860 containerd[1480]: time="2024-12-13T13:32:59.734695484Z" level=info msg="RemovePodSandbox \"79676f3f3e2f7a38baba40f66f780602bce7a10514debc26ebe8fcade8bba8c1\" returns successfully"
Dec 13 13:32:59.736064 containerd[1480]: time="2024-12-13T13:32:59.736033697Z" level=info msg="StopPodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\""
Dec 13 13:32:59.736418 containerd[1480]: time="2024-12-13T13:32:59.736274906Z" level=info msg="TearDown network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" successfully"
Dec 13 13:32:59.736418 containerd[1480]: time="2024-12-13T13:32:59.736295747Z" level=info msg="StopPodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" returns successfully"
Dec 13 13:32:59.737078 containerd[1480]: time="2024-12-13T13:32:59.736639481Z" level=info msg="RemovePodSandbox for \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\""
Dec 13 13:32:59.737078 containerd[1480]: time="2024-12-13T13:32:59.736666362Z" level=info msg="Forcibly stopping sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\""
Dec 13 13:32:59.737078 containerd[1480]: time="2024-12-13T13:32:59.736739365Z" level=info msg="TearDown network for sandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" successfully"
Dec 13 13:32:59.744863 containerd[1480]: time="2024-12-13T13:32:59.744770443Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.745214 containerd[1480]: time="2024-12-13T13:32:59.745191780Z" level=info msg="RemovePodSandbox \"56da58185e2ce28212ed60e59d5f809c47f4f2e08d65267e61c9408f5cae3054\" returns successfully"
Dec 13 13:32:59.745808 containerd[1480]: time="2024-12-13T13:32:59.745771683Z" level=info msg="StopPodSandbox for \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\""
Dec 13 13:32:59.746338 containerd[1480]: time="2024-12-13T13:32:59.746120337Z" level=info msg="TearDown network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\" successfully"
Dec 13 13:32:59.746338 containerd[1480]: time="2024-12-13T13:32:59.746167498Z" level=info msg="StopPodSandbox for \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\" returns successfully"
Dec 13 13:32:59.748552 containerd[1480]: time="2024-12-13T13:32:59.747749321Z" level=info msg="RemovePodSandbox for \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\""
Dec 13 13:32:59.748552 containerd[1480]: time="2024-12-13T13:32:59.747805603Z" level=info msg="Forcibly stopping sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\""
Dec 13 13:32:59.748552 containerd[1480]: time="2024-12-13T13:32:59.747928408Z" level=info msg="TearDown network for sandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\" successfully"
Dec 13 13:32:59.753902 containerd[1480]: time="2024-12-13T13:32:59.753809361Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.754085 containerd[1480]: time="2024-12-13T13:32:59.753940046Z" level=info msg="RemovePodSandbox \"d0e53622919489e669fcb206426cf0fc4d566a7c252069ca39fea38e491642bd\" returns successfully"
Dec 13 13:32:59.754639 containerd[1480]: time="2024-12-13T13:32:59.754364783Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\""
Dec 13 13:32:59.754639 containerd[1480]: time="2024-12-13T13:32:59.754480548Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully"
Dec 13 13:32:59.754639 containerd[1480]: time="2024-12-13T13:32:59.754492508Z" level=info msg="StopPodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully"
Dec 13 13:32:59.755098 containerd[1480]: time="2024-12-13T13:32:59.755051490Z" level=info msg="RemovePodSandbox for \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\""
Dec 13 13:32:59.755098 containerd[1480]: time="2024-12-13T13:32:59.755095052Z" level=info msg="Forcibly stopping sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\""
Dec 13 13:32:59.755318 containerd[1480]: time="2024-12-13T13:32:59.755281900Z" level=info msg="TearDown network for sandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" successfully"
Dec 13 13:32:59.761039 containerd[1480]: time="2024-12-13T13:32:59.760908123Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.761632 containerd[1480]: time="2024-12-13T13:32:59.761120771Z" level=info msg="RemovePodSandbox \"6aa1c96c5e6b9a245c04937a5347c481ea1ac5875ca7cfa3d366679bffff4c5c\" returns successfully"
Dec 13 13:32:59.761964 containerd[1480]: time="2024-12-13T13:32:59.761883641Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\""
Dec 13 13:32:59.762031 containerd[1480]: time="2024-12-13T13:32:59.762000006Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully"
Dec 13 13:32:59.762031 containerd[1480]: time="2024-12-13T13:32:59.762012166Z" level=info msg="StopPodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" returns successfully"
Dec 13 13:32:59.763116 containerd[1480]: time="2024-12-13T13:32:59.762679233Z" level=info msg="RemovePodSandbox for \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\""
Dec 13 13:32:59.763116 containerd[1480]: time="2024-12-13T13:32:59.762715154Z" level=info msg="Forcibly stopping sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\""
Dec 13 13:32:59.763116 containerd[1480]: time="2024-12-13T13:32:59.762815478Z" level=info msg="TearDown network for sandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" successfully"
Dec 13 13:32:59.767525 containerd[1480]: time="2024-12-13T13:32:59.767454942Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.768167 containerd[1480]: time="2024-12-13T13:32:59.767764514Z" level=info msg="RemovePodSandbox \"3f17224462dba03cffd3dda02e9c81ba18db0c29088539fa5c02caf7686c13c9\" returns successfully"
Dec 13 13:32:59.768374 containerd[1480]: time="2024-12-13T13:32:59.768293775Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\""
Dec 13 13:32:59.768480 containerd[1480]: time="2024-12-13T13:32:59.768432421Z" level=info msg="TearDown network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" successfully"
Dec 13 13:32:59.768480 containerd[1480]: time="2024-12-13T13:32:59.768446381Z" level=info msg="StopPodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" returns successfully"
Dec 13 13:32:59.768946 containerd[1480]: time="2024-12-13T13:32:59.768848957Z" level=info msg="RemovePodSandbox for \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\""
Dec 13 13:32:59.769532 containerd[1480]: time="2024-12-13T13:32:59.769052405Z" level=info msg="Forcibly stopping sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\""
Dec 13 13:32:59.769532 containerd[1480]: time="2024-12-13T13:32:59.769170130Z" level=info msg="TearDown network for sandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" successfully"
Dec 13 13:32:59.774626 containerd[1480]: time="2024-12-13T13:32:59.774553783Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Dec 13 13:32:59.774937 containerd[1480]: time="2024-12-13T13:32:59.774908477Z" level=info msg="RemovePodSandbox \"78993a652eb8465fe6ebcffec56f417654396b35e9a54b14e429b8a6365b68d1\" returns successfully" Dec 13 13:32:59.776237 containerd[1480]: time="2024-12-13T13:32:59.776168287Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\"" Dec 13 13:32:59.776386 containerd[1480]: time="2024-12-13T13:32:59.776358735Z" level=info msg="TearDown network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" successfully" Dec 13 13:32:59.776464 containerd[1480]: time="2024-12-13T13:32:59.776386336Z" level=info msg="StopPodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" returns successfully" Dec 13 13:32:59.777556 containerd[1480]: time="2024-12-13T13:32:59.777098524Z" level=info msg="RemovePodSandbox for \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\"" Dec 13 13:32:59.777556 containerd[1480]: time="2024-12-13T13:32:59.777132606Z" level=info msg="Forcibly stopping sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\"" Dec 13 13:32:59.777556 containerd[1480]: time="2024-12-13T13:32:59.777213809Z" level=info msg="TearDown network for sandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" successfully" Dec 13 13:32:59.782822 containerd[1480]: time="2024-12-13T13:32:59.782755789Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.782998 containerd[1480]: time="2024-12-13T13:32:59.782856793Z" level=info msg="RemovePodSandbox \"5f70959d15a376529c7e9367b4c32eb28360ae76de0a7484d851557add2fa306\" returns successfully" Dec 13 13:32:59.783843 containerd[1480]: time="2024-12-13T13:32:59.783636623Z" level=info msg="StopPodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\"" Dec 13 13:32:59.783843 containerd[1480]: time="2024-12-13T13:32:59.783771269Z" level=info msg="TearDown network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" successfully" Dec 13 13:32:59.783843 containerd[1480]: time="2024-12-13T13:32:59.783785949Z" level=info msg="StopPodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" returns successfully" Dec 13 13:32:59.785521 containerd[1480]: time="2024-12-13T13:32:59.785219006Z" level=info msg="RemovePodSandbox for \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\"" Dec 13 13:32:59.785521 containerd[1480]: time="2024-12-13T13:32:59.785272568Z" level=info msg="Forcibly stopping sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\"" Dec 13 13:32:59.785521 containerd[1480]: time="2024-12-13T13:32:59.785406134Z" level=info msg="TearDown network for sandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" successfully" Dec 13 13:32:59.793043 containerd[1480]: time="2024-12-13T13:32:59.792970153Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.793043 containerd[1480]: time="2024-12-13T13:32:59.793052677Z" level=info msg="RemovePodSandbox \"493b3394a3e783e794ecbdac0b87b962dcefb3c7f71ead0a3cc956ebe3fc6ec3\" returns successfully" Dec 13 13:32:59.793704 containerd[1480]: time="2024-12-13T13:32:59.793675461Z" level=info msg="StopPodSandbox for \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\"" Dec 13 13:32:59.794219 containerd[1480]: time="2024-12-13T13:32:59.794077477Z" level=info msg="TearDown network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\" successfully" Dec 13 13:32:59.794219 containerd[1480]: time="2024-12-13T13:32:59.794110039Z" level=info msg="StopPodSandbox for \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\" returns successfully" Dec 13 13:32:59.795566 containerd[1480]: time="2024-12-13T13:32:59.794533375Z" level=info msg="RemovePodSandbox for \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\"" Dec 13 13:32:59.795566 containerd[1480]: time="2024-12-13T13:32:59.794572377Z" level=info msg="Forcibly stopping sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\"" Dec 13 13:32:59.795566 containerd[1480]: time="2024-12-13T13:32:59.794658500Z" level=info msg="TearDown network for sandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\" successfully" Dec 13 13:32:59.801981 containerd[1480]: time="2024-12-13T13:32:59.801933869Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.802374 containerd[1480]: time="2024-12-13T13:32:59.802339885Z" level=info msg="RemovePodSandbox \"6b6d663889e5f85a306c1ca4d21c422a5c97bd28ac2707ba2012c713ceab6e9f\" returns successfully" Dec 13 13:32:59.803033 containerd[1480]: time="2024-12-13T13:32:59.803006671Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:59.803437 containerd[1480]: time="2024-12-13T13:32:59.803413407Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully" Dec 13 13:32:59.803579 containerd[1480]: time="2024-12-13T13:32:59.803555293Z" level=info msg="StopPodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully" Dec 13 13:32:59.804315 containerd[1480]: time="2024-12-13T13:32:59.804280162Z" level=info msg="RemovePodSandbox for \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:59.804426 containerd[1480]: time="2024-12-13T13:32:59.804322203Z" level=info msg="Forcibly stopping sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\"" Dec 13 13:32:59.804472 containerd[1480]: time="2024-12-13T13:32:59.804447088Z" level=info msg="TearDown network for sandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" successfully" Dec 13 13:32:59.810753 containerd[1480]: time="2024-12-13T13:32:59.810568331Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.810753 containerd[1480]: time="2024-12-13T13:32:59.810653334Z" level=info msg="RemovePodSandbox \"4ad3e5f0ccff376a156effaed8457c291b79531b07a4eb461b5a78e4779003e1\" returns successfully" Dec 13 13:32:59.811378 containerd[1480]: time="2024-12-13T13:32:59.811188475Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:59.813216 containerd[1480]: time="2024-12-13T13:32:59.812695215Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully" Dec 13 13:32:59.813216 containerd[1480]: time="2024-12-13T13:32:59.812732217Z" level=info msg="StopPodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully" Dec 13 13:32:59.814904 containerd[1480]: time="2024-12-13T13:32:59.813604011Z" level=info msg="RemovePodSandbox for \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:59.814904 containerd[1480]: time="2024-12-13T13:32:59.813637332Z" level=info msg="Forcibly stopping sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\"" Dec 13 13:32:59.814904 containerd[1480]: time="2024-12-13T13:32:59.813721776Z" level=info msg="TearDown network for sandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" successfully" Dec 13 13:32:59.818193 containerd[1480]: time="2024-12-13T13:32:59.818142151Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.818430 containerd[1480]: time="2024-12-13T13:32:59.818407041Z" level=info msg="RemovePodSandbox \"3722409012b7d75a2bedb23a17534716b5b60ad7579bee0614da0cdb49b85094\" returns successfully" Dec 13 13:32:59.818995 containerd[1480]: time="2024-12-13T13:32:59.818965224Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" Dec 13 13:32:59.819120 containerd[1480]: time="2024-12-13T13:32:59.819095389Z" level=info msg="TearDown network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" successfully" Dec 13 13:32:59.819176 containerd[1480]: time="2024-12-13T13:32:59.819120590Z" level=info msg="StopPodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" returns successfully" Dec 13 13:32:59.819670 containerd[1480]: time="2024-12-13T13:32:59.819644931Z" level=info msg="RemovePodSandbox for \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" Dec 13 13:32:59.820175 containerd[1480]: time="2024-12-13T13:32:59.819773416Z" level=info msg="Forcibly stopping sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\"" Dec 13 13:32:59.820175 containerd[1480]: time="2024-12-13T13:32:59.819878780Z" level=info msg="TearDown network for sandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" successfully" Dec 13 13:32:59.823678 containerd[1480]: time="2024-12-13T13:32:59.823624088Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.823908 containerd[1480]: time="2024-12-13T13:32:59.823886419Z" level=info msg="RemovePodSandbox \"9c9115aa337437b355c940401a3d48ba7f3b4ebc08edd29b546c7e7ac7665087\" returns successfully" Dec 13 13:32:59.824653 containerd[1480]: time="2024-12-13T13:32:59.824613167Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\"" Dec 13 13:32:59.824757 containerd[1480]: time="2024-12-13T13:32:59.824731892Z" level=info msg="TearDown network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" successfully" Dec 13 13:32:59.824757 containerd[1480]: time="2024-12-13T13:32:59.824742093Z" level=info msg="StopPodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" returns successfully" Dec 13 13:32:59.825642 containerd[1480]: time="2024-12-13T13:32:59.825076746Z" level=info msg="RemovePodSandbox for \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\"" Dec 13 13:32:59.825642 containerd[1480]: time="2024-12-13T13:32:59.825105107Z" level=info msg="Forcibly stopping sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\"" Dec 13 13:32:59.825642 containerd[1480]: time="2024-12-13T13:32:59.825182070Z" level=info msg="TearDown network for sandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" successfully" Dec 13 13:32:59.829431 containerd[1480]: time="2024-12-13T13:32:59.829378076Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.829658 containerd[1480]: time="2024-12-13T13:32:59.829637887Z" level=info msg="RemovePodSandbox \"3004ccbdf7c63b4d5345bd8da064cc15a3b9f08279de0511aa4b81fa45bf0ea0\" returns successfully" Dec 13 13:32:59.830287 containerd[1480]: time="2024-12-13T13:32:59.830260711Z" level=info msg="StopPodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\"" Dec 13 13:32:59.830621 containerd[1480]: time="2024-12-13T13:32:59.830596525Z" level=info msg="TearDown network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" successfully" Dec 13 13:32:59.831007 containerd[1480]: time="2024-12-13T13:32:59.830693088Z" level=info msg="StopPodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" returns successfully" Dec 13 13:32:59.831176 containerd[1480]: time="2024-12-13T13:32:59.831138386Z" level=info msg="RemovePodSandbox for \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\"" Dec 13 13:32:59.831232 containerd[1480]: time="2024-12-13T13:32:59.831174428Z" level=info msg="Forcibly stopping sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\"" Dec 13 13:32:59.831280 containerd[1480]: time="2024-12-13T13:32:59.831261431Z" level=info msg="TearDown network for sandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" successfully" Dec 13 13:32:59.836560 containerd[1480]: time="2024-12-13T13:32:59.836186666Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.836560 containerd[1480]: time="2024-12-13T13:32:59.836417595Z" level=info msg="RemovePodSandbox \"a6707f8ea65dec509863910b6b54c7fa901319590fea257f2b43a7b421ff45fc\" returns successfully" Dec 13 13:32:59.837319 containerd[1480]: time="2024-12-13T13:32:59.837135144Z" level=info msg="StopPodSandbox for \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\"" Dec 13 13:32:59.837319 containerd[1480]: time="2024-12-13T13:32:59.837251908Z" level=info msg="TearDown network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\" successfully" Dec 13 13:32:59.837319 containerd[1480]: time="2024-12-13T13:32:59.837262069Z" level=info msg="StopPodSandbox for \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\" returns successfully" Dec 13 13:32:59.839123 containerd[1480]: time="2024-12-13T13:32:59.838921655Z" level=info msg="RemovePodSandbox for \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\"" Dec 13 13:32:59.839123 containerd[1480]: time="2024-12-13T13:32:59.838959056Z" level=info msg="Forcibly stopping sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\"" Dec 13 13:32:59.839123 containerd[1480]: time="2024-12-13T13:32:59.839045019Z" level=info msg="TearDown network for sandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\" successfully" Dec 13 13:32:59.844913 containerd[1480]: time="2024-12-13T13:32:59.844876571Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.845553 containerd[1480]: time="2024-12-13T13:32:59.845518516Z" level=info msg="RemovePodSandbox \"bde286f4d1ef74e921336bcd0af9731caa3d9303371e3824081a844c4b2ad1b8\" returns successfully" Dec 13 13:32:59.846509 containerd[1480]: time="2024-12-13T13:32:59.846232424Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:59.846509 containerd[1480]: time="2024-12-13T13:32:59.846329788Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:59.846509 containerd[1480]: time="2024-12-13T13:32:59.846338829Z" level=info msg="StopPodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:59.847669 containerd[1480]: time="2024-12-13T13:32:59.847633960Z" level=info msg="RemovePodSandbox for \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:59.847669 containerd[1480]: time="2024-12-13T13:32:59.847671121Z" level=info msg="Forcibly stopping sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\"" Dec 13 13:32:59.847797 containerd[1480]: time="2024-12-13T13:32:59.847754045Z" level=info msg="TearDown network for sandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" successfully" Dec 13 13:32:59.853175 containerd[1480]: time="2024-12-13T13:32:59.853130018Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.853323 containerd[1480]: time="2024-12-13T13:32:59.853206181Z" level=info msg="RemovePodSandbox \"234a182a1e3f0f76542e8bdab8cfcefd32f9b42d4bf5f4a9200aa38b6573279c\" returns successfully" Dec 13 13:32:59.854144 containerd[1480]: time="2024-12-13T13:32:59.853834286Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:59.854144 containerd[1480]: time="2024-12-13T13:32:59.853948010Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully" Dec 13 13:32:59.854144 containerd[1480]: time="2024-12-13T13:32:59.853957810Z" level=info msg="StopPodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully" Dec 13 13:32:59.854753 containerd[1480]: time="2024-12-13T13:32:59.854572595Z" level=info msg="RemovePodSandbox for \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:59.854753 containerd[1480]: time="2024-12-13T13:32:59.854609716Z" level=info msg="Forcibly stopping sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\"" Dec 13 13:32:59.854753 containerd[1480]: time="2024-12-13T13:32:59.854683319Z" level=info msg="TearDown network for sandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" successfully" Dec 13 13:32:59.859186 containerd[1480]: time="2024-12-13T13:32:59.858814963Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.859186 containerd[1480]: time="2024-12-13T13:32:59.858888726Z" level=info msg="RemovePodSandbox \"a6c22195c9867d68395baaef900a3129e1e3a991f9c4d76b337fdfbbac22ee5b\" returns successfully" Dec 13 13:32:59.859805 containerd[1480]: time="2024-12-13T13:32:59.859564313Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" Dec 13 13:32:59.859805 containerd[1480]: time="2024-12-13T13:32:59.859670117Z" level=info msg="TearDown network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" successfully" Dec 13 13:32:59.859805 containerd[1480]: time="2024-12-13T13:32:59.859679917Z" level=info msg="StopPodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" returns successfully" Dec 13 13:32:59.860268 containerd[1480]: time="2024-12-13T13:32:59.860245860Z" level=info msg="RemovePodSandbox for \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" Dec 13 13:32:59.860340 containerd[1480]: time="2024-12-13T13:32:59.860278781Z" level=info msg="Forcibly stopping sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\"" Dec 13 13:32:59.860363 containerd[1480]: time="2024-12-13T13:32:59.860353824Z" level=info msg="TearDown network for sandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" successfully" Dec 13 13:32:59.864349 containerd[1480]: time="2024-12-13T13:32:59.864293780Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.864486 containerd[1480]: time="2024-12-13T13:32:59.864379944Z" level=info msg="RemovePodSandbox \"70002239ed097c5b7b7feea4b186c2abda6d69822f50b69d11eceaf059c10b31\" returns successfully" Dec 13 13:32:59.865236 containerd[1480]: time="2024-12-13T13:32:59.864935246Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\"" Dec 13 13:32:59.865236 containerd[1480]: time="2024-12-13T13:32:59.865047490Z" level=info msg="TearDown network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" successfully" Dec 13 13:32:59.865236 containerd[1480]: time="2024-12-13T13:32:59.865058210Z" level=info msg="StopPodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" returns successfully" Dec 13 13:32:59.866059 containerd[1480]: time="2024-12-13T13:32:59.865629793Z" level=info msg="RemovePodSandbox for \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\"" Dec 13 13:32:59.866059 containerd[1480]: time="2024-12-13T13:32:59.865677995Z" level=info msg="Forcibly stopping sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\"" Dec 13 13:32:59.866059 containerd[1480]: time="2024-12-13T13:32:59.865769919Z" level=info msg="TearDown network for sandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" successfully" Dec 13 13:32:59.870381 containerd[1480]: time="2024-12-13T13:32:59.870308699Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.870381 containerd[1480]: time="2024-12-13T13:32:59.870405502Z" level=info msg="RemovePodSandbox \"685df85ae8dbd1aa7ad5a7eeb14a8ea2bd30317a5cdb7c8a8304a0a8e6029933\" returns successfully" Dec 13 13:32:59.871388 containerd[1480]: time="2024-12-13T13:32:59.870873681Z" level=info msg="StopPodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\"" Dec 13 13:32:59.871388 containerd[1480]: time="2024-12-13T13:32:59.870982805Z" level=info msg="TearDown network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" successfully" Dec 13 13:32:59.871388 containerd[1480]: time="2024-12-13T13:32:59.870993726Z" level=info msg="StopPodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" returns successfully" Dec 13 13:32:59.871630 containerd[1480]: time="2024-12-13T13:32:59.871610590Z" level=info msg="RemovePodSandbox for \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\"" Dec 13 13:32:59.871663 containerd[1480]: time="2024-12-13T13:32:59.871639591Z" level=info msg="Forcibly stopping sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\"" Dec 13 13:32:59.871791 containerd[1480]: time="2024-12-13T13:32:59.871748836Z" level=info msg="TearDown network for sandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" successfully" Dec 13 13:32:59.876682 containerd[1480]: time="2024-12-13T13:32:59.876621589Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.876826 containerd[1480]: time="2024-12-13T13:32:59.876704512Z" level=info msg="RemovePodSandbox \"47b9f503f7f7f59848de3bd26c615853b9b26ce2ee4552d6deaebd00adf76a5d\" returns successfully" Dec 13 13:32:59.877520 containerd[1480]: time="2024-12-13T13:32:59.877304816Z" level=info msg="StopPodSandbox for \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\"" Dec 13 13:32:59.877660 containerd[1480]: time="2024-12-13T13:32:59.877488623Z" level=info msg="TearDown network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\" successfully" Dec 13 13:32:59.877831 containerd[1480]: time="2024-12-13T13:32:59.877717352Z" level=info msg="StopPodSandbox for \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\" returns successfully" Dec 13 13:32:59.878204 containerd[1480]: time="2024-12-13T13:32:59.878136289Z" level=info msg="RemovePodSandbox for \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\"" Dec 13 13:32:59.878204 containerd[1480]: time="2024-12-13T13:32:59.878183371Z" level=info msg="Forcibly stopping sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\"" Dec 13 13:32:59.878310 containerd[1480]: time="2024-12-13T13:32:59.878265774Z" level=info msg="TearDown network for sandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\" successfully" Dec 13 13:32:59.881964 containerd[1480]: time="2024-12-13T13:32:59.881912038Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.882382 containerd[1480]: time="2024-12-13T13:32:59.881990882Z" level=info msg="RemovePodSandbox \"d222dc1d21b96b3b0334cc5be87dd360a285cc5916404fb34fd34d0ade267889\" returns successfully" Dec 13 13:32:59.882971 containerd[1480]: time="2024-12-13T13:32:59.882646468Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:59.882971 containerd[1480]: time="2024-12-13T13:32:59.882759632Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:59.882971 containerd[1480]: time="2024-12-13T13:32:59.882771592Z" level=info msg="StopPodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:59.883859 containerd[1480]: time="2024-12-13T13:32:59.883668948Z" level=info msg="RemovePodSandbox for \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:59.883859 containerd[1480]: time="2024-12-13T13:32:59.883707430Z" level=info msg="Forcibly stopping sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\"" Dec 13 13:32:59.883859 containerd[1480]: time="2024-12-13T13:32:59.883801113Z" level=info msg="TearDown network for sandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" successfully" Dec 13 13:32:59.888215 containerd[1480]: time="2024-12-13T13:32:59.888169286Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.889029 containerd[1480]: time="2024-12-13T13:32:59.888386735Z" level=info msg="RemovePodSandbox \"4dd62aedcc96b0a1c919011bf687aadf2ad9eb71c06ad3c4329e565ec8fe542f\" returns successfully" Dec 13 13:32:59.889029 containerd[1480]: time="2024-12-13T13:32:59.888858194Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:59.889029 containerd[1480]: time="2024-12-13T13:32:59.888969238Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully" Dec 13 13:32:59.889029 containerd[1480]: time="2024-12-13T13:32:59.888979719Z" level=info msg="StopPodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully" Dec 13 13:32:59.889728 containerd[1480]: time="2024-12-13T13:32:59.889708547Z" level=info msg="RemovePodSandbox for \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:59.890174 containerd[1480]: time="2024-12-13T13:32:59.889911315Z" level=info msg="Forcibly stopping sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\"" Dec 13 13:32:59.890174 containerd[1480]: time="2024-12-13T13:32:59.889993839Z" level=info msg="TearDown network for sandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" successfully" Dec 13 13:32:59.893971 containerd[1480]: time="2024-12-13T13:32:59.893925795Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.894538 containerd[1480]: time="2024-12-13T13:32:59.894185485Z" level=info msg="RemovePodSandbox \"84b97dcad8d3f5cc6fe67ae527aea626abc23d4a5c1c6637255cb125861c4bf6\" returns successfully" Dec 13 13:32:59.894979 containerd[1480]: time="2024-12-13T13:32:59.894940595Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" Dec 13 13:32:59.895166 containerd[1480]: time="2024-12-13T13:32:59.895140603Z" level=info msg="TearDown network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" successfully" Dec 13 13:32:59.895211 containerd[1480]: time="2024-12-13T13:32:59.895167404Z" level=info msg="StopPodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" returns successfully" Dec 13 13:32:59.897384 containerd[1480]: time="2024-12-13T13:32:59.895820950Z" level=info msg="RemovePodSandbox for \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" Dec 13 13:32:59.897384 containerd[1480]: time="2024-12-13T13:32:59.895852071Z" level=info msg="Forcibly stopping sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\"" Dec 13 13:32:59.897384 containerd[1480]: time="2024-12-13T13:32:59.895947235Z" level=info msg="TearDown network for sandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" successfully" Dec 13 13:32:59.899768 containerd[1480]: time="2024-12-13T13:32:59.899572818Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.899768 containerd[1480]: time="2024-12-13T13:32:59.899659262Z" level=info msg="RemovePodSandbox \"0109fb4666e066d53a6109f6e48ef33ae8fe78135cf7054dd5d5c1c2b75674fe\" returns successfully" Dec 13 13:32:59.900487 containerd[1480]: time="2024-12-13T13:32:59.900453053Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\"" Dec 13 13:32:59.900754 containerd[1480]: time="2024-12-13T13:32:59.900629260Z" level=info msg="TearDown network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" successfully" Dec 13 13:32:59.900785 containerd[1480]: time="2024-12-13T13:32:59.900756225Z" level=info msg="StopPodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" returns successfully" Dec 13 13:32:59.901735 containerd[1480]: time="2024-12-13T13:32:59.901664781Z" level=info msg="RemovePodSandbox for \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\"" Dec 13 13:32:59.901825 containerd[1480]: time="2024-12-13T13:32:59.901744624Z" level=info msg="Forcibly stopping sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\"" Dec 13 13:32:59.901905 containerd[1480]: time="2024-12-13T13:32:59.901870189Z" level=info msg="TearDown network for sandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" successfully" Dec 13 13:32:59.907247 containerd[1480]: time="2024-12-13T13:32:59.907153879Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.907247 containerd[1480]: time="2024-12-13T13:32:59.907247603Z" level=info msg="RemovePodSandbox \"11b32506434d9c8a49c7dafd4f603db4b504e963148095332f5fc940af33dcb5\" returns successfully" Dec 13 13:32:59.908727 containerd[1480]: time="2024-12-13T13:32:59.908183720Z" level=info msg="StopPodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\"" Dec 13 13:32:59.908727 containerd[1480]: time="2024-12-13T13:32:59.908351366Z" level=info msg="TearDown network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" successfully" Dec 13 13:32:59.908727 containerd[1480]: time="2024-12-13T13:32:59.908368567Z" level=info msg="StopPodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" returns successfully" Dec 13 13:32:59.908966 containerd[1480]: time="2024-12-13T13:32:59.908866307Z" level=info msg="RemovePodSandbox for \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\"" Dec 13 13:32:59.908966 containerd[1480]: time="2024-12-13T13:32:59.908894828Z" level=info msg="Forcibly stopping sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\"" Dec 13 13:32:59.909042 containerd[1480]: time="2024-12-13T13:32:59.908977231Z" level=info msg="TearDown network for sandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" successfully" Dec 13 13:32:59.912797 containerd[1480]: time="2024-12-13T13:32:59.912733420Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.913324 containerd[1480]: time="2024-12-13T13:32:59.912819023Z" level=info msg="RemovePodSandbox \"aeab26363dc30af973ad26426df4fcb719df6377afaaa90db450f8c4f579ba1c\" returns successfully" Dec 13 13:32:59.913390 containerd[1480]: time="2024-12-13T13:32:59.913330044Z" level=info msg="StopPodSandbox for \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\"" Dec 13 13:32:59.913545 containerd[1480]: time="2024-12-13T13:32:59.913471289Z" level=info msg="TearDown network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\" successfully" Dec 13 13:32:59.913545 containerd[1480]: time="2024-12-13T13:32:59.913512571Z" level=info msg="StopPodSandbox for \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\" returns successfully" Dec 13 13:32:59.914104 containerd[1480]: time="2024-12-13T13:32:59.914079473Z" level=info msg="RemovePodSandbox for \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\"" Dec 13 13:32:59.914176 containerd[1480]: time="2024-12-13T13:32:59.914115395Z" level=info msg="Forcibly stopping sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\"" Dec 13 13:32:59.914209 containerd[1480]: time="2024-12-13T13:32:59.914193998Z" level=info msg="TearDown network for sandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\" successfully" Dec 13 13:32:59.917861 containerd[1480]: time="2024-12-13T13:32:59.917795581Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.918038 containerd[1480]: time="2024-12-13T13:32:59.917887584Z" level=info msg="RemovePodSandbox \"8972d3156f5d0c9124406ed72e65e88c2264625d2d63cbdeff760491987f461b\" returns successfully" Dec 13 13:32:59.918559 containerd[1480]: time="2024-12-13T13:32:59.918528170Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:59.918662 containerd[1480]: time="2024-12-13T13:32:59.918646734Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:59.918662 containerd[1480]: time="2024-12-13T13:32:59.918657975Z" level=info msg="StopPodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:59.919182 containerd[1480]: time="2024-12-13T13:32:59.919148634Z" level=info msg="RemovePodSandbox for \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:59.919289 containerd[1480]: time="2024-12-13T13:32:59.919181996Z" level=info msg="Forcibly stopping sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\"" Dec 13 13:32:59.919289 containerd[1480]: time="2024-12-13T13:32:59.919261639Z" level=info msg="TearDown network for sandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" successfully" Dec 13 13:32:59.922605 containerd[1480]: time="2024-12-13T13:32:59.922525848Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.922727 containerd[1480]: time="2024-12-13T13:32:59.922614452Z" level=info msg="RemovePodSandbox \"8addfa25b5dc55988a30ad12d8ea311ac276a222d8f8806bb41436a5fa51fa0c\" returns successfully" Dec 13 13:32:59.923176 containerd[1480]: time="2024-12-13T13:32:59.923148633Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:59.923414 containerd[1480]: time="2024-12-13T13:32:59.923254037Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully" Dec 13 13:32:59.923414 containerd[1480]: time="2024-12-13T13:32:59.923272038Z" level=info msg="StopPodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully" Dec 13 13:32:59.925532 containerd[1480]: time="2024-12-13T13:32:59.923763497Z" level=info msg="RemovePodSandbox for \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:59.925532 containerd[1480]: time="2024-12-13T13:32:59.923809659Z" level=info msg="Forcibly stopping sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\"" Dec 13 13:32:59.925532 containerd[1480]: time="2024-12-13T13:32:59.923923143Z" level=info msg="TearDown network for sandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" successfully" Dec 13 13:32:59.929244 containerd[1480]: time="2024-12-13T13:32:59.929169791Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.929512 containerd[1480]: time="2024-12-13T13:32:59.929277756Z" level=info msg="RemovePodSandbox \"f18d0cabcde498f039b601d79474ec465e24fc76f2d4830fe89f6daffe5ae5bd\" returns successfully" Dec 13 13:32:59.930748 containerd[1480]: time="2024-12-13T13:32:59.929877499Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" Dec 13 13:32:59.930748 containerd[1480]: time="2024-12-13T13:32:59.930372159Z" level=info msg="TearDown network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" successfully" Dec 13 13:32:59.930748 containerd[1480]: time="2024-12-13T13:32:59.930403920Z" level=info msg="StopPodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" returns successfully" Dec 13 13:32:59.931572 containerd[1480]: time="2024-12-13T13:32:59.931535125Z" level=info msg="RemovePodSandbox for \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" Dec 13 13:32:59.931707 containerd[1480]: time="2024-12-13T13:32:59.931684171Z" level=info msg="Forcibly stopping sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\"" Dec 13 13:32:59.931926 containerd[1480]: time="2024-12-13T13:32:59.931855178Z" level=info msg="TearDown network for sandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" successfully" Dec 13 13:32:59.937128 containerd[1480]: time="2024-12-13T13:32:59.936994262Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.937826 containerd[1480]: time="2024-12-13T13:32:59.937324035Z" level=info msg="RemovePodSandbox \"e01bce1ccfa349e7b8323598c96c545804b7ebc0992b120c092f4e38cdc9d459\" returns successfully" Dec 13 13:32:59.938448 containerd[1480]: time="2024-12-13T13:32:59.938371156Z" level=info msg="StopPodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" Dec 13 13:32:59.938688 containerd[1480]: time="2024-12-13T13:32:59.938659408Z" level=info msg="TearDown network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" successfully" Dec 13 13:32:59.938859 containerd[1480]: time="2024-12-13T13:32:59.938759531Z" level=info msg="StopPodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" returns successfully" Dec 13 13:32:59.940592 containerd[1480]: time="2024-12-13T13:32:59.939259631Z" level=info msg="RemovePodSandbox for \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" Dec 13 13:32:59.940592 containerd[1480]: time="2024-12-13T13:32:59.939292793Z" level=info msg="Forcibly stopping sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\"" Dec 13 13:32:59.940592 containerd[1480]: time="2024-12-13T13:32:59.939454359Z" level=info msg="TearDown network for sandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" successfully" Dec 13 13:32:59.944033 containerd[1480]: time="2024-12-13T13:32:59.943985939Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.944405 containerd[1480]: time="2024-12-13T13:32:59.944309271Z" level=info msg="RemovePodSandbox \"a9b38c169bf3329bf4e7a4dd4fedb901e37e25ff90f43c34f252f3bb6b94fb80\" returns successfully" Dec 13 13:32:59.944995 containerd[1480]: time="2024-12-13T13:32:59.944960617Z" level=info msg="StopPodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\"" Dec 13 13:32:59.945145 containerd[1480]: time="2024-12-13T13:32:59.945079062Z" level=info msg="TearDown network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" successfully" Dec 13 13:32:59.945145 containerd[1480]: time="2024-12-13T13:32:59.945090342Z" level=info msg="StopPodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" returns successfully" Dec 13 13:32:59.946280 containerd[1480]: time="2024-12-13T13:32:59.946083742Z" level=info msg="RemovePodSandbox for \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\"" Dec 13 13:32:59.946280 containerd[1480]: time="2024-12-13T13:32:59.946119303Z" level=info msg="Forcibly stopping sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\"" Dec 13 13:32:59.946280 containerd[1480]: time="2024-12-13T13:32:59.946213467Z" level=info msg="TearDown network for sandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" successfully" Dec 13 13:32:59.951559 containerd[1480]: time="2024-12-13T13:32:59.951092500Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.951559 containerd[1480]: time="2024-12-13T13:32:59.951189744Z" level=info msg="RemovePodSandbox \"0e68bc5bb0faa88447551e7851618151ae0cd8c77a64ea50384f9193c22e0b7d\" returns successfully" Dec 13 13:32:59.952582 containerd[1480]: time="2024-12-13T13:32:59.952531237Z" level=info msg="StopPodSandbox for \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\"" Dec 13 13:32:59.953055 containerd[1480]: time="2024-12-13T13:32:59.953027537Z" level=info msg="TearDown network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\" successfully" Dec 13 13:32:59.953169 containerd[1480]: time="2024-12-13T13:32:59.953144022Z" level=info msg="StopPodSandbox for \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\" returns successfully" Dec 13 13:32:59.953905 containerd[1480]: time="2024-12-13T13:32:59.953871050Z" level=info msg="RemovePodSandbox for \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\"" Dec 13 13:32:59.954147 containerd[1480]: time="2024-12-13T13:32:59.954127581Z" level=info msg="Forcibly stopping sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\"" Dec 13 13:32:59.954367 containerd[1480]: time="2024-12-13T13:32:59.954345149Z" level=info msg="TearDown network for sandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\" successfully" Dec 13 13:32:59.965767 containerd[1480]: time="2024-12-13T13:32:59.965719280Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Dec 13 13:32:59.965982 containerd[1480]: time="2024-12-13T13:32:59.965964250Z" level=info msg="RemovePodSandbox \"b66ab7df5c24836a1e1fe4ce7d632fd1e7be6d845ba0fc757d9b068bc830918e\" returns successfully" Dec 13 13:33:10.891546 update_engine[1466]: I20241213 13:33:10.889456 1466 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 13 13:33:10.891546 update_engine[1466]: I20241213 13:33:10.889534 1466 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 13 13:33:10.891546 update_engine[1466]: I20241213 13:33:10.889837 1466 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 13 13:33:10.893760 update_engine[1466]: I20241213 13:33:10.893613 1466 omaha_request_params.cc:62] Current group set to alpha Dec 13 13:33:10.895141 update_engine[1466]: I20241213 13:33:10.894873 1466 update_attempter.cc:499] Already updated boot flags. Skipping. Dec 13 13:33:10.895141 update_engine[1466]: I20241213 13:33:10.894924 1466 update_attempter.cc:643] Scheduling an action processor start. 
Dec 13 13:33:10.895141 update_engine[1466]: I20241213 13:33:10.894946 1466 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 13 13:33:10.895141 update_engine[1466]: I20241213 13:33:10.894989 1466 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 13 13:33:10.895141 update_engine[1466]: I20241213 13:33:10.895062 1466 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 13 13:33:10.896854 update_engine[1466]: I20241213 13:33:10.895069 1466 omaha_request_action.cc:272] Request: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: Dec 13 13:33:10.896854 update_engine[1466]: I20241213 13:33:10.895757 1466 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 13 13:33:10.899478 update_engine[1466]: I20241213 13:33:10.899435 1466 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 13 13:33:10.900017 update_engine[1466]: I20241213 13:33:10.899986 1466 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 13 13:33:10.901534 update_engine[1466]: E20241213 13:33:10.901109 1466 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 13 13:33:10.901778 update_engine[1466]: I20241213 13:33:10.901749 1466 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 13 13:33:10.905614 locksmithd[1501]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 13 13:33:18.343591 kubelet[2825]: I1213 13:33:18.342948 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 13 13:33:20.882186 update_engine[1466]: I20241213 13:33:20.881404 1466 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 13 13:33:20.882186 update_engine[1466]: I20241213 13:33:20.881748 1466 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 13 13:33:20.883449 update_engine[1466]: I20241213 13:33:20.883034 1466 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 13 13:33:20.883449 update_engine[1466]: E20241213 13:33:20.883285 1466 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 13 13:33:20.883449 update_engine[1466]: I20241213 13:33:20.883416 1466 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 13 13:33:30.882282 update_engine[1466]: I20241213 13:33:30.881908 1466 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 13 13:33:30.882282 update_engine[1466]: I20241213 13:33:30.882140 1466 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 13 13:33:30.884296 update_engine[1466]: I20241213 13:33:30.883806 1466 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 13 13:33:30.884774 update_engine[1466]: E20241213 13:33:30.884685 1466 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 13 13:33:30.884774 update_engine[1466]: I20241213 13:33:30.884749 1466 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 13 13:33:40.882360 update_engine[1466]: I20241213 13:33:40.881824 1466 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 13 13:33:40.882360 update_engine[1466]: I20241213 13:33:40.882244 1466 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 13 13:33:40.883481 update_engine[1466]: I20241213 13:33:40.883342 1466 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 13 13:33:40.884556 update_engine[1466]: E20241213 13:33:40.883692 1466 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883764 1466 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883777 1466 omaha_request_action.cc:617] Omaha request response: Dec 13 13:33:40.884556 update_engine[1466]: E20241213 13:33:40.883881 1466 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883908 1466 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883919 1466 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883929 1466 update_attempter.cc:306] Processing Done. Dec 13 13:33:40.884556 update_engine[1466]: E20241213 13:33:40.883950 1466 update_attempter.cc:619] Update failed. 
Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883960 1466 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883975 1466 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.883986 1466 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.884094 1466 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.884130 1466 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 13 13:33:40.884556 update_engine[1466]: I20241213 13:33:40.884137 1466 omaha_request_action.cc:272] Request: Dec 13 13:33:40.884556 update_engine[1466]: Dec 13 13:33:40.884556 update_engine[1466]: Dec 13 13:33:40.884556 update_engine[1466]: Dec 13 13:33:40.885230 update_engine[1466]: Dec 13 13:33:40.885230 update_engine[1466]: Dec 13 13:33:40.885230 update_engine[1466]: Dec 13 13:33:40.885230 update_engine[1466]: I20241213 13:33:40.884146 1466 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 13 13:33:40.885230 update_engine[1466]: I20241213 13:33:40.884421 1466 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 13 13:33:40.885391 locksmithd[1501]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 13 13:33:40.886349 update_engine[1466]: I20241213 13:33:40.885943 1466 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 13 13:33:40.886776 update_engine[1466]: E20241213 13:33:40.886480 1466 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886569 1466 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886581 1466 omaha_request_action.cc:617] Omaha request response: Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886590 1466 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886598 1466 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886608 1466 update_attempter.cc:306] Processing Done. Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886617 1466 update_attempter.cc:310] Error event sent. Dec 13 13:33:40.886776 update_engine[1466]: I20241213 13:33:40.886631 1466 update_check_scheduler.cc:74] Next update check in 44m57s Dec 13 13:33:40.887344 locksmithd[1501]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 13 13:35:00.876888 systemd[1]: run-containerd-runc-k8s.io-df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f-runc.t7XbHU.mount: Deactivated successfully. Dec 13 13:36:52.176489 systemd[1]: Started sshd@7-188.245.225.138:22-147.75.109.163:40698.service - OpenSSH per-connection server daemon (147.75.109.163:40698). 
Dec 13 13:36:53.202305 sshd[6322]: Accepted publickey for core from 147.75.109.163 port 40698 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:36:53.204391 sshd-session[6322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:36:53.210979 systemd-logind[1464]: New session 8 of user core. Dec 13 13:36:53.215775 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 13 13:36:53.988121 sshd[6324]: Connection closed by 147.75.109.163 port 40698 Dec 13 13:36:53.988673 sshd-session[6322]: pam_unix(sshd:session): session closed for user core Dec 13 13:36:53.995428 systemd[1]: sshd@7-188.245.225.138:22-147.75.109.163:40698.service: Deactivated successfully. Dec 13 13:36:54.000131 systemd[1]: session-8.scope: Deactivated successfully. Dec 13 13:36:54.002550 systemd-logind[1464]: Session 8 logged out. Waiting for processes to exit. Dec 13 13:36:54.003638 systemd-logind[1464]: Removed session 8. Dec 13 13:36:59.159003 systemd[1]: Started sshd@8-188.245.225.138:22-147.75.109.163:50096.service - OpenSSH per-connection server daemon (147.75.109.163:50096). Dec 13 13:37:00.142484 sshd[6337]: Accepted publickey for core from 147.75.109.163 port 50096 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:00.144797 sshd-session[6337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:00.150642 systemd-logind[1464]: New session 9 of user core. Dec 13 13:37:00.162069 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 13 13:37:00.912279 sshd[6341]: Connection closed by 147.75.109.163 port 50096 Dec 13 13:37:00.911162 sshd-session[6337]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:00.919946 systemd-logind[1464]: Session 9 logged out. Waiting for processes to exit. Dec 13 13:37:00.920424 systemd[1]: sshd@8-188.245.225.138:22-147.75.109.163:50096.service: Deactivated successfully. 
Dec 13 13:37:00.926884 systemd[1]: session-9.scope: Deactivated successfully. Dec 13 13:37:00.929808 systemd-logind[1464]: Removed session 9. Dec 13 13:37:06.089993 systemd[1]: Started sshd@9-188.245.225.138:22-147.75.109.163:50098.service - OpenSSH per-connection server daemon (147.75.109.163:50098). Dec 13 13:37:07.084813 sshd[6393]: Accepted publickey for core from 147.75.109.163 port 50098 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:07.088583 sshd-session[6393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:07.098337 systemd-logind[1464]: New session 10 of user core. Dec 13 13:37:07.100746 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 13 13:37:07.866028 sshd[6395]: Connection closed by 147.75.109.163 port 50098 Dec 13 13:37:07.865928 sshd-session[6393]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:07.872930 systemd[1]: sshd@9-188.245.225.138:22-147.75.109.163:50098.service: Deactivated successfully. Dec 13 13:37:07.876352 systemd[1]: session-10.scope: Deactivated successfully. Dec 13 13:37:07.877970 systemd-logind[1464]: Session 10 logged out. Waiting for processes to exit. Dec 13 13:37:07.879808 systemd-logind[1464]: Removed session 10. Dec 13 13:37:13.041977 systemd[1]: Started sshd@10-188.245.225.138:22-147.75.109.163:49970.service - OpenSSH per-connection server daemon (147.75.109.163:49970). Dec 13 13:37:14.017306 sshd[6426]: Accepted publickey for core from 147.75.109.163 port 49970 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:14.019690 sshd-session[6426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:14.025354 systemd-logind[1464]: New session 11 of user core. Dec 13 13:37:14.034782 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 13 13:37:14.782652 sshd[6428]: Connection closed by 147.75.109.163 port 49970 Dec 13 13:37:14.783487 sshd-session[6426]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:14.789890 systemd[1]: sshd@10-188.245.225.138:22-147.75.109.163:49970.service: Deactivated successfully. Dec 13 13:37:14.795768 systemd[1]: session-11.scope: Deactivated successfully. Dec 13 13:37:14.800041 systemd-logind[1464]: Session 11 logged out. Waiting for processes to exit. Dec 13 13:37:14.802694 systemd-logind[1464]: Removed session 11. Dec 13 13:37:19.964201 systemd[1]: Started sshd@11-188.245.225.138:22-147.75.109.163:40626.service - OpenSSH per-connection server daemon (147.75.109.163:40626). Dec 13 13:37:20.957914 sshd[6442]: Accepted publickey for core from 147.75.109.163 port 40626 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:20.959676 sshd-session[6442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:20.967868 systemd-logind[1464]: New session 12 of user core. Dec 13 13:37:20.972749 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 13 13:37:21.762227 sshd[6444]: Connection closed by 147.75.109.163 port 40626 Dec 13 13:37:21.761669 sshd-session[6442]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:21.767406 systemd[1]: sshd@11-188.245.225.138:22-147.75.109.163:40626.service: Deactivated successfully. Dec 13 13:37:21.773241 systemd[1]: session-12.scope: Deactivated successfully. Dec 13 13:37:21.777641 systemd-logind[1464]: Session 12 logged out. Waiting for processes to exit. Dec 13 13:37:21.781455 systemd-logind[1464]: Removed session 12. Dec 13 13:37:26.942691 systemd[1]: Started sshd@12-188.245.225.138:22-147.75.109.163:60714.service - OpenSSH per-connection server daemon (147.75.109.163:60714). 
Dec 13 13:37:27.917609 sshd[6473]: Accepted publickey for core from 147.75.109.163 port 60714 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:27.918745 sshd-session[6473]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:27.925805 systemd-logind[1464]: New session 13 of user core. Dec 13 13:37:27.930080 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 13 13:37:28.672430 sshd[6475]: Connection closed by 147.75.109.163 port 60714 Dec 13 13:37:28.673489 sshd-session[6473]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:28.678970 systemd[1]: sshd@12-188.245.225.138:22-147.75.109.163:60714.service: Deactivated successfully. Dec 13 13:37:28.682051 systemd[1]: session-13.scope: Deactivated successfully. Dec 13 13:37:28.684591 systemd-logind[1464]: Session 13 logged out. Waiting for processes to exit. Dec 13 13:37:28.687166 systemd-logind[1464]: Removed session 13. Dec 13 13:37:33.847948 systemd[1]: Started sshd@13-188.245.225.138:22-147.75.109.163:60718.service - OpenSSH per-connection server daemon (147.75.109.163:60718). Dec 13 13:37:34.843450 sshd[6526]: Accepted publickey for core from 147.75.109.163 port 60718 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:34.846639 sshd-session[6526]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:34.853185 systemd-logind[1464]: New session 14 of user core. Dec 13 13:37:34.857842 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 13 13:37:35.619760 sshd[6528]: Connection closed by 147.75.109.163 port 60718 Dec 13 13:37:35.619732 sshd-session[6526]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:35.624235 systemd[1]: sshd@13-188.245.225.138:22-147.75.109.163:60718.service: Deactivated successfully. Dec 13 13:37:35.627521 systemd[1]: session-14.scope: Deactivated successfully. 
Dec 13 13:37:35.631118 systemd-logind[1464]: Session 14 logged out. Waiting for processes to exit. Dec 13 13:37:35.633054 systemd-logind[1464]: Removed session 14. Dec 13 13:37:40.794091 systemd[1]: Started sshd@14-188.245.225.138:22-147.75.109.163:35138.service - OpenSSH per-connection server daemon (147.75.109.163:35138). Dec 13 13:37:41.773787 sshd[6540]: Accepted publickey for core from 147.75.109.163 port 35138 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:41.775104 sshd-session[6540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:41.779934 systemd-logind[1464]: New session 15 of user core. Dec 13 13:37:41.787779 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 13 13:37:42.530670 sshd[6543]: Connection closed by 147.75.109.163 port 35138 Dec 13 13:37:42.531458 sshd-session[6540]: pam_unix(sshd:session): session closed for user core Dec 13 13:37:42.538279 systemd[1]: sshd@14-188.245.225.138:22-147.75.109.163:35138.service: Deactivated successfully. Dec 13 13:37:42.542745 systemd[1]: session-15.scope: Deactivated successfully. Dec 13 13:37:42.546800 systemd-logind[1464]: Session 15 logged out. Waiting for processes to exit. Dec 13 13:37:42.548192 systemd-logind[1464]: Removed session 15. Dec 13 13:37:47.711767 systemd[1]: Started sshd@15-188.245.225.138:22-147.75.109.163:60302.service - OpenSSH per-connection server daemon (147.75.109.163:60302). Dec 13 13:37:48.705057 sshd[6557]: Accepted publickey for core from 147.75.109.163 port 60302 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw Dec 13 13:37:48.705775 sshd-session[6557]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 13 13:37:48.713053 systemd-logind[1464]: New session 16 of user core. Dec 13 13:37:48.717821 systemd[1]: Started session-16.scope - Session 16 of User core. 
Dec 13 13:37:49.464115 sshd[6559]: Connection closed by 147.75.109.163 port 60302
Dec 13 13:37:49.463979 sshd-session[6557]: pam_unix(sshd:session): session closed for user core
Dec 13 13:37:49.468232 systemd[1]: sshd@15-188.245.225.138:22-147.75.109.163:60302.service: Deactivated successfully.
Dec 13 13:37:49.471264 systemd[1]: session-16.scope: Deactivated successfully.
Dec 13 13:37:49.474620 systemd-logind[1464]: Session 16 logged out. Waiting for processes to exit.
Dec 13 13:37:49.476688 systemd-logind[1464]: Removed session 16.
Dec 13 13:37:54.641065 systemd[1]: Started sshd@16-188.245.225.138:22-147.75.109.163:60312.service - OpenSSH per-connection server daemon (147.75.109.163:60312).
Dec 13 13:37:55.637844 sshd[6572]: Accepted publickey for core from 147.75.109.163 port 60312 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:37:55.642075 sshd-session[6572]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:37:55.648240 systemd-logind[1464]: New session 17 of user core.
Dec 13 13:37:55.654169 systemd[1]: Started session-17.scope - Session 17 of User core.
Dec 13 13:37:56.396882 sshd[6574]: Connection closed by 147.75.109.163 port 60312
Dec 13 13:37:56.396783 sshd-session[6572]: pam_unix(sshd:session): session closed for user core
Dec 13 13:37:56.403058 systemd-logind[1464]: Session 17 logged out. Waiting for processes to exit.
Dec 13 13:37:56.403371 systemd[1]: sshd@16-188.245.225.138:22-147.75.109.163:60312.service: Deactivated successfully.
Dec 13 13:37:56.406869 systemd[1]: session-17.scope: Deactivated successfully.
Dec 13 13:37:56.411034 systemd-logind[1464]: Removed session 17.
Dec 13 13:38:01.572089 systemd[1]: Started sshd@17-188.245.225.138:22-147.75.109.163:34286.service - OpenSSH per-connection server daemon (147.75.109.163:34286).
Dec 13 13:38:02.551128 sshd[6609]: Accepted publickey for core from 147.75.109.163 port 34286 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:02.552834 sshd-session[6609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:02.559322 systemd-logind[1464]: New session 18 of user core.
Dec 13 13:38:02.564823 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 13 13:38:03.319669 sshd[6611]: Connection closed by 147.75.109.163 port 34286
Dec 13 13:38:03.320422 sshd-session[6609]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:03.326958 systemd-logind[1464]: Session 18 logged out. Waiting for processes to exit.
Dec 13 13:38:03.327976 systemd[1]: sshd@17-188.245.225.138:22-147.75.109.163:34286.service: Deactivated successfully.
Dec 13 13:38:03.331946 systemd[1]: session-18.scope: Deactivated successfully.
Dec 13 13:38:03.334989 systemd-logind[1464]: Removed session 18.
Dec 13 13:38:08.501942 systemd[1]: Started sshd@18-188.245.225.138:22-147.75.109.163:38874.service - OpenSSH per-connection server daemon (147.75.109.163:38874).
Dec 13 13:38:09.498003 sshd[6650]: Accepted publickey for core from 147.75.109.163 port 38874 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:09.498648 sshd-session[6650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:09.515968 systemd-logind[1464]: New session 19 of user core.
Dec 13 13:38:09.523790 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 13 13:38:10.259297 sshd[6670]: Connection closed by 147.75.109.163 port 38874
Dec 13 13:38:10.260095 sshd-session[6650]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:10.265629 systemd[1]: sshd@18-188.245.225.138:22-147.75.109.163:38874.service: Deactivated successfully.
Dec 13 13:38:10.270256 systemd[1]: session-19.scope: Deactivated successfully.
Dec 13 13:38:10.271574 systemd-logind[1464]: Session 19 logged out. Waiting for processes to exit.
Dec 13 13:38:10.272846 systemd-logind[1464]: Removed session 19.
Dec 13 13:38:15.436049 systemd[1]: Started sshd@19-188.245.225.138:22-147.75.109.163:38880.service - OpenSSH per-connection server daemon (147.75.109.163:38880).
Dec 13 13:38:16.418490 sshd[6681]: Accepted publickey for core from 147.75.109.163 port 38880 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:16.422058 sshd-session[6681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:16.428226 systemd-logind[1464]: New session 20 of user core.
Dec 13 13:38:16.434751 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 13 13:38:17.194090 sshd[6685]: Connection closed by 147.75.109.163 port 38880
Dec 13 13:38:17.195861 sshd-session[6681]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:17.200747 systemd[1]: sshd@19-188.245.225.138:22-147.75.109.163:38880.service: Deactivated successfully.
Dec 13 13:38:17.204876 systemd[1]: session-20.scope: Deactivated successfully.
Dec 13 13:38:17.206545 systemd-logind[1464]: Session 20 logged out. Waiting for processes to exit.
Dec 13 13:38:17.207706 systemd-logind[1464]: Removed session 20.
Dec 13 13:38:22.377883 systemd[1]: Started sshd@20-188.245.225.138:22-147.75.109.163:35084.service - OpenSSH per-connection server daemon (147.75.109.163:35084).
Dec 13 13:38:23.380426 sshd[6697]: Accepted publickey for core from 147.75.109.163 port 35084 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:23.383037 sshd-session[6697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:23.389435 systemd-logind[1464]: New session 21 of user core.
Dec 13 13:38:23.392786 systemd[1]: Started session-21.scope - Session 21 of User core.
Dec 13 13:38:24.160088 sshd[6699]: Connection closed by 147.75.109.163 port 35084
Dec 13 13:38:24.159404 sshd-session[6697]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:24.165294 systemd-logind[1464]: Session 21 logged out. Waiting for processes to exit.
Dec 13 13:38:24.166176 systemd[1]: sshd@20-188.245.225.138:22-147.75.109.163:35084.service: Deactivated successfully.
Dec 13 13:38:24.169460 systemd[1]: session-21.scope: Deactivated successfully.
Dec 13 13:38:24.170945 systemd-logind[1464]: Removed session 21.
Dec 13 13:38:29.335023 systemd[1]: Started sshd@21-188.245.225.138:22-147.75.109.163:45954.service - OpenSSH per-connection server daemon (147.75.109.163:45954).
Dec 13 13:38:30.313390 sshd[6712]: Accepted publickey for core from 147.75.109.163 port 45954 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:30.315692 sshd-session[6712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:30.321473 systemd-logind[1464]: New session 22 of user core.
Dec 13 13:38:30.325965 systemd[1]: Started session-22.scope - Session 22 of User core.
Dec 13 13:38:30.860337 systemd[1]: run-containerd-runc-k8s.io-df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f-runc.bGYdbE.mount: Deactivated successfully.
Dec 13 13:38:31.100415 sshd[6714]: Connection closed by 147.75.109.163 port 45954
Dec 13 13:38:31.100074 sshd-session[6712]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:31.104388 systemd-logind[1464]: Session 22 logged out. Waiting for processes to exit.
Dec 13 13:38:31.104710 systemd[1]: sshd@21-188.245.225.138:22-147.75.109.163:45954.service: Deactivated successfully.
Dec 13 13:38:31.108040 systemd[1]: session-22.scope: Deactivated successfully.
Dec 13 13:38:31.110588 systemd-logind[1464]: Removed session 22.
Dec 13 13:38:36.273645 systemd[1]: Started sshd@22-188.245.225.138:22-147.75.109.163:52316.service - OpenSSH per-connection server daemon (147.75.109.163:52316).
Dec 13 13:38:37.272242 sshd[6766]: Accepted publickey for core from 147.75.109.163 port 52316 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:37.279402 sshd-session[6766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:37.295359 systemd-logind[1464]: New session 23 of user core.
Dec 13 13:38:37.298899 systemd[1]: Started session-23.scope - Session 23 of User core.
Dec 13 13:38:38.038122 sshd[6768]: Connection closed by 147.75.109.163 port 52316
Dec 13 13:38:38.037820 sshd-session[6766]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:38.046392 systemd[1]: sshd@22-188.245.225.138:22-147.75.109.163:52316.service: Deactivated successfully.
Dec 13 13:38:38.048967 systemd[1]: session-23.scope: Deactivated successfully.
Dec 13 13:38:38.051031 systemd-logind[1464]: Session 23 logged out. Waiting for processes to exit.
Dec 13 13:38:38.052988 systemd-logind[1464]: Removed session 23.
Dec 13 13:38:43.212969 systemd[1]: Started sshd@23-188.245.225.138:22-147.75.109.163:52326.service - OpenSSH per-connection server daemon (147.75.109.163:52326).
Dec 13 13:38:44.202167 sshd[6780]: Accepted publickey for core from 147.75.109.163 port 52326 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:44.205148 sshd-session[6780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:44.212869 systemd-logind[1464]: New session 24 of user core.
Dec 13 13:38:44.218815 systemd[1]: Started session-24.scope - Session 24 of User core.
Dec 13 13:38:44.972617 sshd[6782]: Connection closed by 147.75.109.163 port 52326
Dec 13 13:38:44.973514 sshd-session[6780]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:44.979317 systemd[1]: sshd@23-188.245.225.138:22-147.75.109.163:52326.service: Deactivated successfully.
Dec 13 13:38:44.979935 systemd-logind[1464]: Session 24 logged out. Waiting for processes to exit.
Dec 13 13:38:44.982249 systemd[1]: session-24.scope: Deactivated successfully.
Dec 13 13:38:44.983568 systemd-logind[1464]: Removed session 24.
Dec 13 13:38:50.146968 systemd[1]: Started sshd@24-188.245.225.138:22-147.75.109.163:51432.service - OpenSSH per-connection server daemon (147.75.109.163:51432).
Dec 13 13:38:51.159511 sshd[6797]: Accepted publickey for core from 147.75.109.163 port 51432 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:51.160032 sshd-session[6797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:51.165254 systemd-logind[1464]: New session 25 of user core.
Dec 13 13:38:51.172820 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 13 13:38:51.949806 sshd[6799]: Connection closed by 147.75.109.163 port 51432
Dec 13 13:38:51.949683 sshd-session[6797]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:51.953781 systemd[1]: sshd@24-188.245.225.138:22-147.75.109.163:51432.service: Deactivated successfully.
Dec 13 13:38:51.956435 systemd[1]: session-25.scope: Deactivated successfully.
Dec 13 13:38:51.958151 systemd-logind[1464]: Session 25 logged out. Waiting for processes to exit.
Dec 13 13:38:51.959962 systemd-logind[1464]: Removed session 25.
Dec 13 13:38:57.128871 systemd[1]: Started sshd@25-188.245.225.138:22-147.75.109.163:56046.service - OpenSSH per-connection server daemon (147.75.109.163:56046).
Dec 13 13:38:58.113393 sshd[6816]: Accepted publickey for core from 147.75.109.163 port 56046 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:38:58.114201 sshd-session[6816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:38:58.119053 systemd-logind[1464]: New session 26 of user core.
Dec 13 13:38:58.123816 systemd[1]: Started session-26.scope - Session 26 of User core.
Dec 13 13:38:58.905409 sshd[6818]: Connection closed by 147.75.109.163 port 56046
Dec 13 13:38:58.904759 sshd-session[6816]: pam_unix(sshd:session): session closed for user core
Dec 13 13:38:58.909750 systemd[1]: sshd@25-188.245.225.138:22-147.75.109.163:56046.service: Deactivated successfully.
Dec 13 13:38:58.912424 systemd[1]: session-26.scope: Deactivated successfully.
Dec 13 13:38:58.914072 systemd-logind[1464]: Session 26 logged out. Waiting for processes to exit.
Dec 13 13:38:58.916033 systemd-logind[1464]: Removed session 26.
Dec 13 13:39:04.082354 systemd[1]: Started sshd@26-188.245.225.138:22-147.75.109.163:56054.service - OpenSSH per-connection server daemon (147.75.109.163:56054).
Dec 13 13:39:05.064191 sshd[6887]: Accepted publickey for core from 147.75.109.163 port 56054 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:05.066983 sshd-session[6887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:05.073485 systemd-logind[1464]: New session 27 of user core.
Dec 13 13:39:05.078463 systemd[1]: Started session-27.scope - Session 27 of User core.
Dec 13 13:39:05.829756 sshd[6889]: Connection closed by 147.75.109.163 port 56054
Dec 13 13:39:05.830705 sshd-session[6887]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:05.835414 systemd[1]: sshd@26-188.245.225.138:22-147.75.109.163:56054.service: Deactivated successfully.
Dec 13 13:39:05.837923 systemd[1]: session-27.scope: Deactivated successfully.
Dec 13 13:39:05.839724 systemd-logind[1464]: Session 27 logged out. Waiting for processes to exit.
Dec 13 13:39:05.841486 systemd-logind[1464]: Removed session 27.
Dec 13 13:39:11.007055 systemd[1]: Started sshd@27-188.245.225.138:22-147.75.109.163:53420.service - OpenSSH per-connection server daemon (147.75.109.163:53420).
Dec 13 13:39:11.993918 sshd[6920]: Accepted publickey for core from 147.75.109.163 port 53420 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:11.996122 sshd-session[6920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:12.003908 systemd-logind[1464]: New session 28 of user core.
Dec 13 13:39:12.006747 systemd[1]: Started session-28.scope - Session 28 of User core.
Dec 13 13:39:12.762535 sshd[6922]: Connection closed by 147.75.109.163 port 53420
Dec 13 13:39:12.762197 sshd-session[6920]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:12.767844 systemd[1]: sshd@27-188.245.225.138:22-147.75.109.163:53420.service: Deactivated successfully.
Dec 13 13:39:12.774113 systemd[1]: session-28.scope: Deactivated successfully.
Dec 13 13:39:12.776413 systemd-logind[1464]: Session 28 logged out. Waiting for processes to exit.
Dec 13 13:39:12.777885 systemd-logind[1464]: Removed session 28.
Dec 13 13:39:17.939778 systemd[1]: Started sshd@28-188.245.225.138:22-147.75.109.163:59968.service - OpenSSH per-connection server daemon (147.75.109.163:59968).
Dec 13 13:39:18.957171 sshd[6937]: Accepted publickey for core from 147.75.109.163 port 59968 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:18.959481 sshd-session[6937]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:18.965751 systemd-logind[1464]: New session 29 of user core.
Dec 13 13:39:18.972150 systemd[1]: Started session-29.scope - Session 29 of User core.
Dec 13 13:39:19.738797 sshd[6939]: Connection closed by 147.75.109.163 port 59968
Dec 13 13:39:19.738746 sshd-session[6937]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:19.743331 systemd[1]: sshd@28-188.245.225.138:22-147.75.109.163:59968.service: Deactivated successfully.
Dec 13 13:39:19.749608 systemd[1]: session-29.scope: Deactivated successfully.
Dec 13 13:39:19.753621 systemd-logind[1464]: Session 29 logged out. Waiting for processes to exit.
Dec 13 13:39:19.755134 systemd-logind[1464]: Removed session 29.
Dec 13 13:39:24.909956 systemd[1]: Started sshd@29-188.245.225.138:22-147.75.109.163:59976.service - OpenSSH per-connection server daemon (147.75.109.163:59976).
Dec 13 13:39:25.923455 sshd[6951]: Accepted publickey for core from 147.75.109.163 port 59976 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:25.925488 sshd-session[6951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:25.930816 systemd-logind[1464]: New session 30 of user core.
Dec 13 13:39:25.935843 systemd[1]: Started session-30.scope - Session 30 of User core.
Dec 13 13:39:26.695459 sshd[6953]: Connection closed by 147.75.109.163 port 59976
Dec 13 13:39:26.695327 sshd-session[6951]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:26.703243 systemd[1]: sshd@29-188.245.225.138:22-147.75.109.163:59976.service: Deactivated successfully.
Dec 13 13:39:26.706143 systemd[1]: session-30.scope: Deactivated successfully.
Dec 13 13:39:26.707086 systemd-logind[1464]: Session 30 logged out. Waiting for processes to exit.
Dec 13 13:39:26.708419 systemd-logind[1464]: Removed session 30.
Dec 13 13:39:31.875907 systemd[1]: Started sshd@30-188.245.225.138:22-147.75.109.163:35566.service - OpenSSH per-connection server daemon (147.75.109.163:35566).
Dec 13 13:39:32.861361 sshd[6986]: Accepted publickey for core from 147.75.109.163 port 35566 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:32.864791 sshd-session[6986]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:32.871407 systemd-logind[1464]: New session 31 of user core.
Dec 13 13:39:32.875848 systemd[1]: Started session-31.scope - Session 31 of User core.
Dec 13 13:39:33.650832 sshd[6988]: Connection closed by 147.75.109.163 port 35566
Dec 13 13:39:33.651295 sshd-session[6986]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:33.658093 systemd[1]: sshd@30-188.245.225.138:22-147.75.109.163:35566.service: Deactivated successfully.
Dec 13 13:39:33.661974 systemd[1]: session-31.scope: Deactivated successfully.
Dec 13 13:39:33.664306 systemd-logind[1464]: Session 31 logged out. Waiting for processes to exit.
Dec 13 13:39:33.665406 systemd-logind[1464]: Removed session 31.
Dec 13 13:39:38.830212 systemd[1]: Started sshd@31-188.245.225.138:22-147.75.109.163:40846.service - OpenSSH per-connection server daemon (147.75.109.163:40846).
Dec 13 13:39:39.821472 sshd[7021]: Accepted publickey for core from 147.75.109.163 port 40846 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:39.823993 sshd-session[7021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:39.830888 systemd-logind[1464]: New session 32 of user core.
Dec 13 13:39:39.834743 systemd[1]: Started session-32.scope - Session 32 of User core.
Dec 13 13:39:40.589032 sshd[7023]: Connection closed by 147.75.109.163 port 40846
Dec 13 13:39:40.590322 sshd-session[7021]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:40.595739 systemd-logind[1464]: Session 32 logged out. Waiting for processes to exit.
Dec 13 13:39:40.595808 systemd[1]: sshd@31-188.245.225.138:22-147.75.109.163:40846.service: Deactivated successfully.
Dec 13 13:39:40.600227 systemd[1]: session-32.scope: Deactivated successfully.
Dec 13 13:39:40.603483 systemd-logind[1464]: Removed session 32.
Dec 13 13:39:45.771520 systemd[1]: Started sshd@32-188.245.225.138:22-147.75.109.163:40852.service - OpenSSH per-connection server daemon (147.75.109.163:40852).
Dec 13 13:39:46.754664 sshd[7036]: Accepted publickey for core from 147.75.109.163 port 40852 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:46.756958 sshd-session[7036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:46.762466 systemd-logind[1464]: New session 33 of user core.
Dec 13 13:39:46.767779 systemd[1]: Started session-33.scope - Session 33 of User core.
Dec 13 13:39:47.516398 sshd[7040]: Connection closed by 147.75.109.163 port 40852
Dec 13 13:39:47.517326 sshd-session[7036]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:47.524291 systemd[1]: sshd@32-188.245.225.138:22-147.75.109.163:40852.service: Deactivated successfully.
Dec 13 13:39:47.530893 systemd[1]: session-33.scope: Deactivated successfully.
Dec 13 13:39:47.533564 systemd-logind[1464]: Session 33 logged out. Waiting for processes to exit.
Dec 13 13:39:47.534763 systemd-logind[1464]: Removed session 33.
Dec 13 13:39:52.696924 systemd[1]: Started sshd@33-188.245.225.138:22-147.75.109.163:51480.service - OpenSSH per-connection server daemon (147.75.109.163:51480).
Dec 13 13:39:53.677595 sshd[7052]: Accepted publickey for core from 147.75.109.163 port 51480 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:39:53.679890 sshd-session[7052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:39:53.690438 systemd-logind[1464]: New session 34 of user core.
Dec 13 13:39:53.694896 systemd[1]: Started session-34.scope - Session 34 of User core.
Dec 13 13:39:54.433470 sshd[7057]: Connection closed by 147.75.109.163 port 51480
Dec 13 13:39:54.434956 sshd-session[7052]: pam_unix(sshd:session): session closed for user core
Dec 13 13:39:54.441617 systemd[1]: sshd@33-188.245.225.138:22-147.75.109.163:51480.service: Deactivated successfully.
Dec 13 13:39:54.444336 systemd-logind[1464]: Session 34 logged out. Waiting for processes to exit.
Dec 13 13:39:54.445439 systemd[1]: session-34.scope: Deactivated successfully.
Dec 13 13:39:54.449085 systemd-logind[1464]: Removed session 34.
Dec 13 13:39:59.617992 systemd[1]: Started sshd@34-188.245.225.138:22-147.75.109.163:46318.service - OpenSSH per-connection server daemon (147.75.109.163:46318).
Dec 13 13:40:00.607041 sshd[7068]: Accepted publickey for core from 147.75.109.163 port 46318 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:00.610143 sshd-session[7068]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:00.620659 systemd-logind[1464]: New session 35 of user core.
Dec 13 13:40:00.623798 systemd[1]: Started session-35.scope - Session 35 of User core.
Dec 13 13:40:01.378357 sshd[7072]: Connection closed by 147.75.109.163 port 46318
Dec 13 13:40:01.379404 sshd-session[7068]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:01.383691 systemd[1]: sshd@34-188.245.225.138:22-147.75.109.163:46318.service: Deactivated successfully.
Dec 13 13:40:01.388329 systemd[1]: session-35.scope: Deactivated successfully.
Dec 13 13:40:01.392034 systemd-logind[1464]: Session 35 logged out. Waiting for processes to exit.
Dec 13 13:40:01.393541 systemd-logind[1464]: Removed session 35.
Dec 13 13:40:06.557961 systemd[1]: Started sshd@35-188.245.225.138:22-147.75.109.163:49936.service - OpenSSH per-connection server daemon (147.75.109.163:49936).
Dec 13 13:40:07.565165 sshd[7128]: Accepted publickey for core from 147.75.109.163 port 49936 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:07.565897 sshd-session[7128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:07.573382 systemd-logind[1464]: New session 36 of user core.
Dec 13 13:40:07.577852 systemd[1]: Started session-36.scope - Session 36 of User core.
Dec 13 13:40:08.331701 sshd[7130]: Connection closed by 147.75.109.163 port 49936
Dec 13 13:40:08.334591 sshd-session[7128]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:08.338779 systemd[1]: sshd@35-188.245.225.138:22-147.75.109.163:49936.service: Deactivated successfully.
Dec 13 13:40:08.342674 systemd[1]: session-36.scope: Deactivated successfully.
Dec 13 13:40:08.345066 systemd-logind[1464]: Session 36 logged out. Waiting for processes to exit.
Dec 13 13:40:08.347320 systemd-logind[1464]: Removed session 36.
Dec 13 13:40:13.522351 systemd[1]: Started sshd@36-188.245.225.138:22-147.75.109.163:49948.service - OpenSSH per-connection server daemon (147.75.109.163:49948).
Dec 13 13:40:14.518362 sshd[7160]: Accepted publickey for core from 147.75.109.163 port 49948 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:14.520378 sshd-session[7160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:14.526792 systemd-logind[1464]: New session 37 of user core.
Dec 13 13:40:14.530733 systemd[1]: Started session-37.scope - Session 37 of User core.
Dec 13 13:40:15.324059 sshd[7162]: Connection closed by 147.75.109.163 port 49948
Dec 13 13:40:15.323828 sshd-session[7160]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:15.330069 systemd[1]: sshd@36-188.245.225.138:22-147.75.109.163:49948.service: Deactivated successfully.
Dec 13 13:40:15.332694 systemd[1]: session-37.scope: Deactivated successfully.
Dec 13 13:40:15.334414 systemd-logind[1464]: Session 37 logged out. Waiting for processes to exit.
Dec 13 13:40:15.335700 systemd-logind[1464]: Removed session 37.
Dec 13 13:40:20.497977 systemd[1]: Started sshd@37-188.245.225.138:22-147.75.109.163:37694.service - OpenSSH per-connection server daemon (147.75.109.163:37694).
Dec 13 13:40:21.474057 sshd[7176]: Accepted publickey for core from 147.75.109.163 port 37694 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:21.476457 sshd-session[7176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:21.482028 systemd-logind[1464]: New session 38 of user core.
Dec 13 13:40:21.493955 systemd[1]: Started session-38.scope - Session 38 of User core.
Dec 13 13:40:22.228518 sshd[7178]: Connection closed by 147.75.109.163 port 37694
Dec 13 13:40:22.229470 sshd-session[7176]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:22.235402 systemd[1]: sshd@37-188.245.225.138:22-147.75.109.163:37694.service: Deactivated successfully.
Dec 13 13:40:22.239077 systemd[1]: session-38.scope: Deactivated successfully.
Dec 13 13:40:22.240290 systemd-logind[1464]: Session 38 logged out. Waiting for processes to exit.
Dec 13 13:40:22.242115 systemd-logind[1464]: Removed session 38.
Dec 13 13:40:24.423429 systemd[1]: Started sshd@38-188.245.225.138:22-167.99.157.155:47864.service - OpenSSH per-connection server daemon (167.99.157.155:47864).
Dec 13 13:40:24.583883 sshd[7190]: Connection closed by 167.99.157.155 port 47864
Dec 13 13:40:24.586418 systemd[1]: sshd@38-188.245.225.138:22-167.99.157.155:47864.service: Deactivated successfully.
Dec 13 13:40:27.410704 systemd[1]: Started sshd@39-188.245.225.138:22-147.75.109.163:45670.service - OpenSSH per-connection server daemon (147.75.109.163:45670).
Dec 13 13:40:28.409169 sshd[7195]: Accepted publickey for core from 147.75.109.163 port 45670 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:28.410963 sshd-session[7195]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:28.418277 systemd-logind[1464]: New session 39 of user core.
Dec 13 13:40:28.422802 systemd[1]: Started session-39.scope - Session 39 of User core.
Dec 13 13:40:29.188180 sshd[7202]: Connection closed by 147.75.109.163 port 45670
Dec 13 13:40:29.188953 sshd-session[7195]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:29.194252 systemd-logind[1464]: Session 39 logged out. Waiting for processes to exit.
Dec 13 13:40:29.194757 systemd[1]: sshd@39-188.245.225.138:22-147.75.109.163:45670.service: Deactivated successfully.
Dec 13 13:40:29.197854 systemd[1]: session-39.scope: Deactivated successfully.
Dec 13 13:40:29.199837 systemd-logind[1464]: Removed session 39.
Dec 13 13:40:34.361123 systemd[1]: Started sshd@40-188.245.225.138:22-147.75.109.163:45680.service - OpenSSH per-connection server daemon (147.75.109.163:45680).
Dec 13 13:40:35.357333 sshd[7267]: Accepted publickey for core from 147.75.109.163 port 45680 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:35.359631 sshd-session[7267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:35.365619 systemd-logind[1464]: New session 40 of user core.
Dec 13 13:40:35.372857 systemd[1]: Started session-40.scope - Session 40 of User core.
Dec 13 13:40:36.120566 sshd[7269]: Connection closed by 147.75.109.163 port 45680
Dec 13 13:40:36.121956 sshd-session[7267]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:36.127046 systemd[1]: sshd@40-188.245.225.138:22-147.75.109.163:45680.service: Deactivated successfully.
Dec 13 13:40:36.129832 systemd[1]: session-40.scope: Deactivated successfully.
Dec 13 13:40:36.134034 systemd-logind[1464]: Session 40 logged out. Waiting for processes to exit.
Dec 13 13:40:36.137238 systemd-logind[1464]: Removed session 40.
Dec 13 13:40:41.304773 systemd[1]: Started sshd@41-188.245.225.138:22-147.75.109.163:54476.service - OpenSSH per-connection server daemon (147.75.109.163:54476).
Dec 13 13:40:42.297575 sshd[7281]: Accepted publickey for core from 147.75.109.163 port 54476 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:42.299861 sshd-session[7281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:42.305887 systemd-logind[1464]: New session 41 of user core.
Dec 13 13:40:42.312820 systemd[1]: Started session-41.scope - Session 41 of User core.
Dec 13 13:40:43.066031 sshd[7283]: Connection closed by 147.75.109.163 port 54476
Dec 13 13:40:43.066568 sshd-session[7281]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:43.073972 systemd[1]: sshd@41-188.245.225.138:22-147.75.109.163:54476.service: Deactivated successfully.
Dec 13 13:40:43.073994 systemd-logind[1464]: Session 41 logged out. Waiting for processes to exit.
Dec 13 13:40:43.080819 systemd[1]: session-41.scope: Deactivated successfully.
Dec 13 13:40:43.083809 systemd-logind[1464]: Removed session 41.
Dec 13 13:40:48.242033 systemd[1]: Started sshd@42-188.245.225.138:22-147.75.109.163:44660.service - OpenSSH per-connection server daemon (147.75.109.163:44660).
Dec 13 13:40:49.218738 sshd[7297]: Accepted publickey for core from 147.75.109.163 port 44660 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:49.220948 sshd-session[7297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:49.230155 systemd-logind[1464]: New session 42 of user core.
Dec 13 13:40:49.237085 systemd[1]: Started session-42.scope - Session 42 of User core.
Dec 13 13:40:49.995631 sshd[7299]: Connection closed by 147.75.109.163 port 44660
Dec 13 13:40:49.997140 sshd-session[7297]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:50.003030 systemd[1]: sshd@42-188.245.225.138:22-147.75.109.163:44660.service: Deactivated successfully.
Dec 13 13:40:50.008248 systemd[1]: session-42.scope: Deactivated successfully.
Dec 13 13:40:50.011005 systemd-logind[1464]: Session 42 logged out. Waiting for processes to exit.
Dec 13 13:40:50.012377 systemd-logind[1464]: Removed session 42.
Dec 13 13:40:55.171872 systemd[1]: Started sshd@43-188.245.225.138:22-147.75.109.163:44670.service - OpenSSH per-connection server daemon (147.75.109.163:44670).
Dec 13 13:40:56.165264 sshd[7311]: Accepted publickey for core from 147.75.109.163 port 44670 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:56.168131 sshd-session[7311]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:56.173681 systemd-logind[1464]: New session 43 of user core.
Dec 13 13:40:56.177025 systemd[1]: Started session-43.scope - Session 43 of User core.
Dec 13 13:40:56.940741 sshd[7313]: Connection closed by 147.75.109.163 port 44670
Dec 13 13:40:56.941603 sshd-session[7311]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:56.947811 systemd[1]: sshd@43-188.245.225.138:22-147.75.109.163:44670.service: Deactivated successfully.
Dec 13 13:40:56.951892 systemd[1]: session-43.scope: Deactivated successfully.
Dec 13 13:40:56.954037 systemd-logind[1464]: Session 43 logged out. Waiting for processes to exit.
Dec 13 13:40:56.957134 systemd-logind[1464]: Removed session 43.
Dec 13 13:40:57.123998 systemd[1]: Started sshd@44-188.245.225.138:22-147.75.109.163:33996.service - OpenSSH per-connection server daemon (147.75.109.163:33996).
Dec 13 13:40:58.113555 sshd[7325]: Accepted publickey for core from 147.75.109.163 port 33996 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:40:58.116568 sshd-session[7325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:40:58.123536 systemd-logind[1464]: New session 44 of user core.
Dec 13 13:40:58.127808 systemd[1]: Started session-44.scope - Session 44 of User core.
Dec 13 13:40:58.962532 sshd[7327]: Connection closed by 147.75.109.163 port 33996
Dec 13 13:40:58.963053 sshd-session[7325]: pam_unix(sshd:session): session closed for user core
Dec 13 13:40:58.968343 systemd[1]: sshd@44-188.245.225.138:22-147.75.109.163:33996.service: Deactivated successfully.
Dec 13 13:40:58.972930 systemd[1]: session-44.scope: Deactivated successfully.
Dec 13 13:40:58.975183 systemd-logind[1464]: Session 44 logged out. Waiting for processes to exit.
Dec 13 13:40:58.976896 systemd-logind[1464]: Removed session 44.
Dec 13 13:40:59.137090 systemd[1]: Started sshd@45-188.245.225.138:22-147.75.109.163:34002.service - OpenSSH per-connection server daemon (147.75.109.163:34002).
Dec 13 13:41:00.114170 sshd[7335]: Accepted publickey for core from 147.75.109.163 port 34002 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:00.115766 sshd-session[7335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:00.125068 systemd-logind[1464]: New session 45 of user core.
Dec 13 13:41:00.127740 systemd[1]: Started session-45.scope - Session 45 of User core.
Dec 13 13:41:00.881246 sshd[7339]: Connection closed by 147.75.109.163 port 34002
Dec 13 13:41:00.884254 sshd-session[7335]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:00.894882 systemd[1]: sshd@45-188.245.225.138:22-147.75.109.163:34002.service: Deactivated successfully.
Dec 13 13:41:00.898327 systemd[1]: session-45.scope: Deactivated successfully.
Dec 13 13:41:00.904419 systemd-logind[1464]: Session 45 logged out. Waiting for processes to exit.
Dec 13 13:41:00.907482 systemd-logind[1464]: Removed session 45.
Dec 13 13:41:06.066904 systemd[1]: Started sshd@46-188.245.225.138:22-147.75.109.163:34016.service - OpenSSH per-connection server daemon (147.75.109.163:34016).
Dec 13 13:41:07.058539 sshd[7400]: Accepted publickey for core from 147.75.109.163 port 34016 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:07.060097 sshd-session[7400]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:07.068014 systemd-logind[1464]: New session 46 of user core.
Dec 13 13:41:07.074263 systemd[1]: Started session-46.scope - Session 46 of User core.
Dec 13 13:41:07.830724 sshd[7402]: Connection closed by 147.75.109.163 port 34016
Dec 13 13:41:07.831491 sshd-session[7400]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:07.836181 systemd[1]: sshd@46-188.245.225.138:22-147.75.109.163:34016.service: Deactivated successfully.
Dec 13 13:41:07.840168 systemd[1]: session-46.scope: Deactivated successfully.
Dec 13 13:41:07.842009 systemd-logind[1464]: Session 46 logged out. Waiting for processes to exit.
Dec 13 13:41:07.843831 systemd-logind[1464]: Removed session 46.
Dec 13 13:41:13.012010 systemd[1]: Started sshd@47-188.245.225.138:22-147.75.109.163:35966.service - OpenSSH per-connection server daemon (147.75.109.163:35966).
Dec 13 13:41:13.998531 sshd[7431]: Accepted publickey for core from 147.75.109.163 port 35966 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:14.000195 sshd-session[7431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:14.010012 systemd-logind[1464]: New session 47 of user core.
Dec 13 13:41:14.011765 systemd[1]: Started session-47.scope - Session 47 of User core.
Dec 13 13:41:14.789487 sshd[7433]: Connection closed by 147.75.109.163 port 35966
Dec 13 13:41:14.793199 sshd-session[7431]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:14.798798 systemd[1]: sshd@47-188.245.225.138:22-147.75.109.163:35966.service: Deactivated successfully.
Dec 13 13:41:14.806952 systemd[1]: session-47.scope: Deactivated successfully.
Dec 13 13:41:14.810432 systemd-logind[1464]: Session 47 logged out. Waiting for processes to exit.
Dec 13 13:41:14.811469 systemd-logind[1464]: Removed session 47.
Dec 13 13:41:19.974020 systemd[1]: Started sshd@48-188.245.225.138:22-147.75.109.163:39138.service - OpenSSH per-connection server daemon (147.75.109.163:39138).
Dec 13 13:41:20.967812 sshd[7446]: Accepted publickey for core from 147.75.109.163 port 39138 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:20.968854 sshd-session[7446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:20.976389 systemd-logind[1464]: New session 48 of user core.
Dec 13 13:41:20.984921 systemd[1]: Started session-48.scope - Session 48 of User core.
Dec 13 13:41:21.739024 sshd[7448]: Connection closed by 147.75.109.163 port 39138
Dec 13 13:41:21.739762 sshd-session[7446]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:21.749132 systemd[1]: sshd@48-188.245.225.138:22-147.75.109.163:39138.service: Deactivated successfully.
Dec 13 13:41:21.755567 systemd[1]: session-48.scope: Deactivated successfully.
Dec 13 13:41:21.757200 systemd-logind[1464]: Session 48 logged out. Waiting for processes to exit.
Dec 13 13:41:21.760484 systemd-logind[1464]: Removed session 48.
Dec 13 13:41:26.911137 systemd[1]: Started sshd@49-188.245.225.138:22-147.75.109.163:40640.service - OpenSSH per-connection server daemon (147.75.109.163:40640).
Dec 13 13:41:27.904040 sshd[7459]: Accepted publickey for core from 147.75.109.163 port 40640 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:27.904811 sshd-session[7459]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:27.910254 systemd-logind[1464]: New session 49 of user core.
Dec 13 13:41:27.915813 systemd[1]: Started session-49.scope - Session 49 of User core.
Dec 13 13:41:28.681148 sshd[7461]: Connection closed by 147.75.109.163 port 40640
Dec 13 13:41:28.683034 sshd-session[7459]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:28.687689 systemd[1]: sshd@49-188.245.225.138:22-147.75.109.163:40640.service: Deactivated successfully.
Dec 13 13:41:28.690304 systemd[1]: session-49.scope: Deactivated successfully.
Dec 13 13:41:28.693213 systemd-logind[1464]: Session 49 logged out. Waiting for processes to exit.
Dec 13 13:41:28.695736 systemd-logind[1464]: Removed session 49.
Dec 13 13:41:33.860099 systemd[1]: Started sshd@50-188.245.225.138:22-147.75.109.163:40646.service - OpenSSH per-connection server daemon (147.75.109.163:40646).
Dec 13 13:41:34.845787 sshd[7510]: Accepted publickey for core from 147.75.109.163 port 40646 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:34.849464 sshd-session[7510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:34.856241 systemd-logind[1464]: New session 50 of user core.
Dec 13 13:41:34.863266 systemd[1]: Started session-50.scope - Session 50 of User core.
Dec 13 13:41:35.608814 sshd[7512]: Connection closed by 147.75.109.163 port 40646
Dec 13 13:41:35.609770 sshd-session[7510]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:35.615278 systemd-logind[1464]: Session 50 logged out. Waiting for processes to exit.
Dec 13 13:41:35.618668 systemd[1]: sshd@50-188.245.225.138:22-147.75.109.163:40646.service: Deactivated successfully.
Dec 13 13:41:35.623202 systemd[1]: session-50.scope: Deactivated successfully.
Dec 13 13:41:35.625748 systemd-logind[1464]: Removed session 50.
Dec 13 13:41:40.790767 systemd[1]: Started sshd@51-188.245.225.138:22-147.75.109.163:57686.service - OpenSSH per-connection server daemon (147.75.109.163:57686).
Dec 13 13:41:41.787368 sshd[7524]: Accepted publickey for core from 147.75.109.163 port 57686 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:41.790305 sshd-session[7524]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:41.800068 systemd-logind[1464]: New session 51 of user core.
Dec 13 13:41:41.804875 systemd[1]: Started session-51.scope - Session 51 of User core.
Dec 13 13:41:42.556528 sshd[7526]: Connection closed by 147.75.109.163 port 57686
Dec 13 13:41:42.556249 sshd-session[7524]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:42.562777 systemd[1]: sshd@51-188.245.225.138:22-147.75.109.163:57686.service: Deactivated successfully.
Dec 13 13:41:42.569131 systemd[1]: session-51.scope: Deactivated successfully.
Dec 13 13:41:42.571605 systemd-logind[1464]: Session 51 logged out. Waiting for processes to exit.
Dec 13 13:41:42.573229 systemd-logind[1464]: Removed session 51.
Dec 13 13:41:44.951870 systemd[1]: Started sshd@52-188.245.225.138:22-159.223.178.117:60042.service - OpenSSH per-connection server daemon (159.223.178.117:60042).
Dec 13 13:41:45.094944 sshd[7538]: Connection closed by 159.223.178.117 port 60042
Dec 13 13:41:45.097081 systemd[1]: sshd@52-188.245.225.138:22-159.223.178.117:60042.service: Deactivated successfully.
Dec 13 13:41:47.735186 systemd[1]: Started sshd@53-188.245.225.138:22-147.75.109.163:42504.service - OpenSSH per-connection server daemon (147.75.109.163:42504).
Dec 13 13:41:48.729808 sshd[7544]: Accepted publickey for core from 147.75.109.163 port 42504 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:48.731838 sshd-session[7544]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:48.740084 systemd-logind[1464]: New session 52 of user core.
Dec 13 13:41:48.745214 systemd[1]: Started session-52.scope - Session 52 of User core.
Dec 13 13:41:49.495924 sshd[7546]: Connection closed by 147.75.109.163 port 42504
Dec 13 13:41:49.496720 sshd-session[7544]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:49.501894 systemd[1]: sshd@53-188.245.225.138:22-147.75.109.163:42504.service: Deactivated successfully.
Dec 13 13:41:49.506173 systemd[1]: session-52.scope: Deactivated successfully.
Dec 13 13:41:49.507484 systemd-logind[1464]: Session 52 logged out. Waiting for processes to exit.
Dec 13 13:41:49.509243 systemd-logind[1464]: Removed session 52.
Dec 13 13:41:54.677896 systemd[1]: Started sshd@54-188.245.225.138:22-147.75.109.163:42506.service - OpenSSH per-connection server daemon (147.75.109.163:42506).
Dec 13 13:41:55.676431 sshd[7557]: Accepted publickey for core from 147.75.109.163 port 42506 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:41:55.680331 sshd-session[7557]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:41:55.687551 systemd-logind[1464]: New session 53 of user core.
Dec 13 13:41:55.695266 systemd[1]: Started session-53.scope - Session 53 of User core.
Dec 13 13:41:56.460452 sshd[7559]: Connection closed by 147.75.109.163 port 42506
Dec 13 13:41:56.462001 sshd-session[7557]: pam_unix(sshd:session): session closed for user core
Dec 13 13:41:56.466487 systemd-logind[1464]: Session 53 logged out. Waiting for processes to exit.
Dec 13 13:41:56.467349 systemd[1]: sshd@54-188.245.225.138:22-147.75.109.163:42506.service: Deactivated successfully.
Dec 13 13:41:56.470760 systemd[1]: session-53.scope: Deactivated successfully.
Dec 13 13:41:56.474976 systemd-logind[1464]: Removed session 53.
Dec 13 13:42:01.639869 systemd[1]: Started sshd@55-188.245.225.138:22-147.75.109.163:55032.service - OpenSSH per-connection server daemon (147.75.109.163:55032).
Dec 13 13:42:02.633566 sshd[7596]: Accepted publickey for core from 147.75.109.163 port 55032 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:02.634888 sshd-session[7596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:02.640816 systemd-logind[1464]: New session 54 of user core.
Dec 13 13:42:02.644730 systemd[1]: Started session-54.scope - Session 54 of User core.
Dec 13 13:42:03.405867 sshd[7598]: Connection closed by 147.75.109.163 port 55032
Dec 13 13:42:03.406336 sshd-session[7596]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:03.411911 systemd[1]: sshd@55-188.245.225.138:22-147.75.109.163:55032.service: Deactivated successfully.
Dec 13 13:42:03.415207 systemd[1]: session-54.scope: Deactivated successfully.
Dec 13 13:42:03.416853 systemd-logind[1464]: Session 54 logged out. Waiting for processes to exit.
Dec 13 13:42:03.418584 systemd-logind[1464]: Removed session 54.
Dec 13 13:42:08.583049 systemd[1]: Started sshd@56-188.245.225.138:22-147.75.109.163:35610.service - OpenSSH per-connection server daemon (147.75.109.163:35610).
Dec 13 13:42:09.564916 sshd[7644]: Accepted publickey for core from 147.75.109.163 port 35610 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:09.567776 sshd-session[7644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:09.573555 systemd-logind[1464]: New session 55 of user core.
Dec 13 13:42:09.581247 systemd[1]: Started session-55.scope - Session 55 of User core.
Dec 13 13:42:10.329952 sshd[7665]: Connection closed by 147.75.109.163 port 35610
Dec 13 13:42:10.331587 sshd-session[7644]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:10.335951 systemd[1]: sshd@56-188.245.225.138:22-147.75.109.163:35610.service: Deactivated successfully.
Dec 13 13:42:10.341973 systemd[1]: session-55.scope: Deactivated successfully.
Dec 13 13:42:10.343335 systemd-logind[1464]: Session 55 logged out. Waiting for processes to exit.
Dec 13 13:42:10.345280 systemd-logind[1464]: Removed session 55.
Dec 13 13:42:15.511661 systemd[1]: Started sshd@57-188.245.225.138:22-147.75.109.163:35624.service - OpenSSH per-connection server daemon (147.75.109.163:35624).
Dec 13 13:42:16.501336 sshd[7676]: Accepted publickey for core from 147.75.109.163 port 35624 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:16.504357 sshd-session[7676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:16.510243 systemd-logind[1464]: New session 56 of user core.
Dec 13 13:42:16.522868 systemd[1]: Started session-56.scope - Session 56 of User core.
Dec 13 13:42:17.278139 sshd[7680]: Connection closed by 147.75.109.163 port 35624
Dec 13 13:42:17.276935 sshd-session[7676]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:17.282094 systemd[1]: sshd@57-188.245.225.138:22-147.75.109.163:35624.service: Deactivated successfully.
Dec 13 13:42:17.287403 systemd[1]: session-56.scope: Deactivated successfully.
Dec 13 13:42:17.292333 systemd-logind[1464]: Session 56 logged out. Waiting for processes to exit.
Dec 13 13:42:17.294588 systemd-logind[1464]: Removed session 56.
Dec 13 13:42:22.458944 systemd[1]: Started sshd@58-188.245.225.138:22-147.75.109.163:33492.service - OpenSSH per-connection server daemon (147.75.109.163:33492).
Dec 13 13:42:23.453863 sshd[7691]: Accepted publickey for core from 147.75.109.163 port 33492 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:23.456230 sshd-session[7691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:23.461733 systemd-logind[1464]: New session 57 of user core.
Dec 13 13:42:23.472916 systemd[1]: Started session-57.scope - Session 57 of User core.
Dec 13 13:42:24.236524 sshd[7693]: Connection closed by 147.75.109.163 port 33492
Dec 13 13:42:24.235958 sshd-session[7691]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:24.242157 systemd[1]: sshd@58-188.245.225.138:22-147.75.109.163:33492.service: Deactivated successfully.
Dec 13 13:42:24.248083 systemd[1]: session-57.scope: Deactivated successfully.
Dec 13 13:42:24.250557 systemd-logind[1464]: Session 57 logged out. Waiting for processes to exit.
Dec 13 13:42:24.253343 systemd-logind[1464]: Removed session 57.
Dec 13 13:42:29.420196 systemd[1]: Started sshd@59-188.245.225.138:22-147.75.109.163:41954.service - OpenSSH per-connection server daemon (147.75.109.163:41954).
Dec 13 13:42:30.411543 sshd[7704]: Accepted publickey for core from 147.75.109.163 port 41954 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:30.412871 sshd-session[7704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:30.420612 systemd-logind[1464]: New session 58 of user core.
Dec 13 13:42:30.425834 systemd[1]: Started session-58.scope - Session 58 of User core.
Dec 13 13:42:30.852943 systemd[1]: run-containerd-runc-k8s.io-df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f-runc.oSGmTn.mount: Deactivated successfully.
Dec 13 13:42:31.174404 sshd[7706]: Connection closed by 147.75.109.163 port 41954
Dec 13 13:42:31.178755 sshd-session[7704]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:31.185146 systemd[1]: sshd@59-188.245.225.138:22-147.75.109.163:41954.service: Deactivated successfully.
Dec 13 13:42:31.193305 systemd[1]: session-58.scope: Deactivated successfully.
Dec 13 13:42:31.195172 systemd-logind[1464]: Session 58 logged out. Waiting for processes to exit.
Dec 13 13:42:31.199402 systemd-logind[1464]: Removed session 58.
Dec 13 13:42:36.354916 systemd[1]: Started sshd@60-188.245.225.138:22-147.75.109.163:35234.service - OpenSSH per-connection server daemon (147.75.109.163:35234).
Dec 13 13:42:37.365389 sshd[7759]: Accepted publickey for core from 147.75.109.163 port 35234 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:37.368245 sshd-session[7759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:37.377850 systemd-logind[1464]: New session 59 of user core.
Dec 13 13:42:37.380256 systemd[1]: Started session-59.scope - Session 59 of User core.
Dec 13 13:42:38.141446 sshd[7761]: Connection closed by 147.75.109.163 port 35234
Dec 13 13:42:38.142688 sshd-session[7759]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:38.147607 systemd[1]: sshd@60-188.245.225.138:22-147.75.109.163:35234.service: Deactivated successfully.
Dec 13 13:42:38.150163 systemd[1]: session-59.scope: Deactivated successfully.
Dec 13 13:42:38.153744 systemd-logind[1464]: Session 59 logged out. Waiting for processes to exit.
Dec 13 13:42:38.155084 systemd-logind[1464]: Removed session 59.
Dec 13 13:42:43.319734 systemd[1]: Started sshd@61-188.245.225.138:22-147.75.109.163:35238.service - OpenSSH per-connection server daemon (147.75.109.163:35238).
Dec 13 13:42:44.298357 sshd[7773]: Accepted publickey for core from 147.75.109.163 port 35238 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:44.300205 sshd-session[7773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:44.306680 systemd-logind[1464]: New session 60 of user core.
Dec 13 13:42:44.312848 systemd[1]: Started session-60.scope - Session 60 of User core.
Dec 13 13:42:45.055957 sshd[7775]: Connection closed by 147.75.109.163 port 35238
Dec 13 13:42:45.055786 sshd-session[7773]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:45.062017 systemd-logind[1464]: Session 60 logged out. Waiting for processes to exit.
Dec 13 13:42:45.062487 systemd[1]: sshd@61-188.245.225.138:22-147.75.109.163:35238.service: Deactivated successfully.
Dec 13 13:42:45.067395 systemd[1]: session-60.scope: Deactivated successfully.
Dec 13 13:42:45.069124 systemd-logind[1464]: Removed session 60.
Dec 13 13:42:50.235859 systemd[1]: Started sshd@62-188.245.225.138:22-147.75.109.163:39778.service - OpenSSH per-connection server daemon (147.75.109.163:39778).
Dec 13 13:42:51.231258 sshd[7788]: Accepted publickey for core from 147.75.109.163 port 39778 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:51.232127 sshd-session[7788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:51.238068 systemd-logind[1464]: New session 61 of user core.
Dec 13 13:42:51.241877 systemd[1]: Started session-61.scope - Session 61 of User core.
Dec 13 13:42:51.992292 sshd[7790]: Connection closed by 147.75.109.163 port 39778
Dec 13 13:42:51.994019 sshd-session[7788]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:52.000907 systemd-logind[1464]: Session 61 logged out. Waiting for processes to exit.
Dec 13 13:42:52.002143 systemd[1]: sshd@62-188.245.225.138:22-147.75.109.163:39778.service: Deactivated successfully.
Dec 13 13:42:52.005074 systemd[1]: session-61.scope: Deactivated successfully.
Dec 13 13:42:52.008329 systemd-logind[1464]: Removed session 61.
Dec 13 13:42:57.171356 systemd[1]: Started sshd@63-188.245.225.138:22-147.75.109.163:33164.service - OpenSSH per-connection server daemon (147.75.109.163:33164).
Dec 13 13:42:58.152620 sshd[7801]: Accepted publickey for core from 147.75.109.163 port 33164 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:42:58.156668 sshd-session[7801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:42:58.162219 systemd-logind[1464]: New session 62 of user core.
Dec 13 13:42:58.174850 systemd[1]: Started session-62.scope - Session 62 of User core.
Dec 13 13:42:58.921249 sshd[7803]: Connection closed by 147.75.109.163 port 33164
Dec 13 13:42:58.922571 sshd-session[7801]: pam_unix(sshd:session): session closed for user core
Dec 13 13:42:58.928234 systemd-logind[1464]: Session 62 logged out. Waiting for processes to exit.
Dec 13 13:42:58.929009 systemd[1]: sshd@63-188.245.225.138:22-147.75.109.163:33164.service: Deactivated successfully.
Dec 13 13:42:58.934144 systemd[1]: session-62.scope: Deactivated successfully.
Dec 13 13:42:58.936671 systemd-logind[1464]: Removed session 62.
Dec 13 13:43:04.097986 systemd[1]: Started sshd@64-188.245.225.138:22-147.75.109.163:33176.service - OpenSSH per-connection server daemon (147.75.109.163:33176).
Dec 13 13:43:05.094092 sshd[7858]: Accepted publickey for core from 147.75.109.163 port 33176 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:05.097721 sshd-session[7858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:05.108044 systemd-logind[1464]: New session 63 of user core.
Dec 13 13:43:05.114750 systemd[1]: Started session-63.scope - Session 63 of User core.
Dec 13 13:43:05.868539 sshd[7860]: Connection closed by 147.75.109.163 port 33176
Dec 13 13:43:05.868814 sshd-session[7858]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:05.876975 systemd-logind[1464]: Session 63 logged out. Waiting for processes to exit.
Dec 13 13:43:05.878228 systemd[1]: sshd@64-188.245.225.138:22-147.75.109.163:33176.service: Deactivated successfully.
Dec 13 13:43:05.883745 systemd[1]: session-63.scope: Deactivated successfully.
Dec 13 13:43:05.885794 systemd-logind[1464]: Removed session 63.
Dec 13 13:43:11.056386 systemd[1]: Started sshd@65-188.245.225.138:22-147.75.109.163:40618.service - OpenSSH per-connection server daemon (147.75.109.163:40618).
Dec 13 13:43:12.055217 sshd[7890]: Accepted publickey for core from 147.75.109.163 port 40618 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:12.057549 sshd-session[7890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:12.063586 systemd-logind[1464]: New session 64 of user core.
Dec 13 13:43:12.071820 systemd[1]: Started session-64.scope - Session 64 of User core.
Dec 13 13:43:12.827246 sshd[7892]: Connection closed by 147.75.109.163 port 40618
Dec 13 13:43:12.828113 sshd-session[7890]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:12.836179 systemd[1]: sshd@65-188.245.225.138:22-147.75.109.163:40618.service: Deactivated successfully.
Dec 13 13:43:12.842596 systemd[1]: session-64.scope: Deactivated successfully.
Dec 13 13:43:12.844838 systemd-logind[1464]: Session 64 logged out. Waiting for processes to exit.
Dec 13 13:43:12.847742 systemd-logind[1464]: Removed session 64.
Dec 13 13:43:18.005272 systemd[1]: Started sshd@66-188.245.225.138:22-147.75.109.163:41872.service - OpenSSH per-connection server daemon (147.75.109.163:41872).
Dec 13 13:43:18.996657 sshd[7905]: Accepted publickey for core from 147.75.109.163 port 41872 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:18.999170 sshd-session[7905]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:19.006954 systemd-logind[1464]: New session 65 of user core.
Dec 13 13:43:19.012810 systemd[1]: Started session-65.scope - Session 65 of User core.
Dec 13 13:43:19.757094 sshd[7907]: Connection closed by 147.75.109.163 port 41872
Dec 13 13:43:19.757827 sshd-session[7905]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:19.765403 systemd[1]: sshd@66-188.245.225.138:22-147.75.109.163:41872.service: Deactivated successfully.
Dec 13 13:43:19.766272 systemd-logind[1464]: Session 65 logged out. Waiting for processes to exit.
Dec 13 13:43:19.770086 systemd[1]: session-65.scope: Deactivated successfully.
Dec 13 13:43:19.772373 systemd-logind[1464]: Removed session 65.
Dec 13 13:43:24.933967 systemd[1]: Started sshd@67-188.245.225.138:22-147.75.109.163:41888.service - OpenSSH per-connection server daemon (147.75.109.163:41888).
Dec 13 13:43:25.927000 sshd[7926]: Accepted publickey for core from 147.75.109.163 port 41888 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:25.930250 sshd-session[7926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:25.937753 systemd-logind[1464]: New session 66 of user core.
Dec 13 13:43:25.944811 systemd[1]: Started session-66.scope - Session 66 of User core.
Dec 13 13:43:26.691280 sshd[7928]: Connection closed by 147.75.109.163 port 41888
Dec 13 13:43:26.692097 sshd-session[7926]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:26.696340 systemd[1]: sshd@67-188.245.225.138:22-147.75.109.163:41888.service: Deactivated successfully.
Dec 13 13:43:26.698713 systemd[1]: session-66.scope: Deactivated successfully.
Dec 13 13:43:26.699816 systemd-logind[1464]: Session 66 logged out. Waiting for processes to exit.
Dec 13 13:43:26.702641 systemd-logind[1464]: Removed session 66.
Dec 13 13:43:31.865931 systemd[1]: Started sshd@68-188.245.225.138:22-147.75.109.163:40714.service - OpenSSH per-connection server daemon (147.75.109.163:40714).
Dec 13 13:43:32.852297 sshd[7961]: Accepted publickey for core from 147.75.109.163 port 40714 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:32.856081 sshd-session[7961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:32.868269 systemd-logind[1464]: New session 67 of user core.
Dec 13 13:43:32.877751 systemd[1]: Started session-67.scope - Session 67 of User core.
Dec 13 13:43:33.621724 sshd[7964]: Connection closed by 147.75.109.163 port 40714
Dec 13 13:43:33.622838 sshd-session[7961]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:33.629294 systemd[1]: sshd@68-188.245.225.138:22-147.75.109.163:40714.service: Deactivated successfully.
Dec 13 13:43:33.633523 systemd[1]: session-67.scope: Deactivated successfully.
Dec 13 13:43:33.635562 systemd-logind[1464]: Session 67 logged out. Waiting for processes to exit.
Dec 13 13:43:33.638395 systemd-logind[1464]: Removed session 67.
Dec 13 13:43:33.789011 systemd[1]: run-containerd-runc-k8s.io-7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe-runc.jbDkDo.mount: Deactivated successfully.
Dec 13 13:43:38.798946 systemd[1]: Started sshd@69-188.245.225.138:22-147.75.109.163:34870.service - OpenSSH per-connection server daemon (147.75.109.163:34870).
Dec 13 13:43:39.790877 sshd[7994]: Accepted publickey for core from 147.75.109.163 port 34870 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:39.793321 sshd-session[7994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:39.799679 systemd-logind[1464]: New session 68 of user core.
Dec 13 13:43:39.808297 systemd[1]: Started session-68.scope - Session 68 of User core.
Dec 13 13:43:40.568226 sshd[8008]: Connection closed by 147.75.109.163 port 34870
Dec 13 13:43:40.569477 sshd-session[7994]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:40.572848 systemd[1]: sshd@69-188.245.225.138:22-147.75.109.163:34870.service: Deactivated successfully.
Dec 13 13:43:40.575995 systemd[1]: session-68.scope: Deactivated successfully.
Dec 13 13:43:40.577945 systemd-logind[1464]: Session 68 logged out. Waiting for processes to exit.
Dec 13 13:43:40.579344 systemd-logind[1464]: Removed session 68.
Dec 13 13:43:45.748646 systemd[1]: Started sshd@70-188.245.225.138:22-147.75.109.163:34880.service - OpenSSH per-connection server daemon (147.75.109.163:34880).
Dec 13 13:43:46.739249 sshd[8025]: Accepted publickey for core from 147.75.109.163 port 34880 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:46.741942 sshd-session[8025]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:46.750689 systemd-logind[1464]: New session 69 of user core.
Dec 13 13:43:46.760121 systemd[1]: Started session-69.scope - Session 69 of User core.
Dec 13 13:43:47.498855 sshd[8029]: Connection closed by 147.75.109.163 port 34880
Dec 13 13:43:47.502418 sshd-session[8025]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:47.508745 systemd[1]: sshd@70-188.245.225.138:22-147.75.109.163:34880.service: Deactivated successfully.
Dec 13 13:43:47.511626 systemd[1]: session-69.scope: Deactivated successfully.
Dec 13 13:43:47.513448 systemd-logind[1464]: Session 69 logged out. Waiting for processes to exit.
Dec 13 13:43:47.515278 systemd-logind[1464]: Removed session 69.
Dec 13 13:43:52.671905 systemd[1]: Started sshd@71-188.245.225.138:22-147.75.109.163:38134.service - OpenSSH per-connection server daemon (147.75.109.163:38134).
Dec 13 13:43:53.668043 sshd[8040]: Accepted publickey for core from 147.75.109.163 port 38134 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:43:53.673102 sshd-session[8040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:43:53.679969 systemd-logind[1464]: New session 70 of user core.
Dec 13 13:43:53.685763 systemd[1]: Started session-70.scope - Session 70 of User core.
Dec 13 13:43:54.435248 sshd[8042]: Connection closed by 147.75.109.163 port 38134
Dec 13 13:43:54.436540 sshd-session[8040]: pam_unix(sshd:session): session closed for user core
Dec 13 13:43:54.442693 systemd[1]: sshd@71-188.245.225.138:22-147.75.109.163:38134.service: Deactivated successfully.
Dec 13 13:43:54.445934 systemd[1]: session-70.scope: Deactivated successfully.
Dec 13 13:43:54.447181 systemd-logind[1464]: Session 70 logged out. Waiting for processes to exit.
Dec 13 13:43:54.449721 systemd-logind[1464]: Removed session 70.
Dec 13 13:43:59.611167 systemd[1]: Started sshd@72-188.245.225.138:22-147.75.109.163:33162.service - OpenSSH per-connection server daemon (147.75.109.163:33162).
Dec 13 13:44:00.598753 sshd[8053]: Accepted publickey for core from 147.75.109.163 port 33162 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:00.601201 sshd-session[8053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:00.608941 systemd-logind[1464]: New session 71 of user core.
Dec 13 13:44:00.613741 systemd[1]: Started session-71.scope - Session 71 of User core.
Dec 13 13:44:01.369662 sshd[8057]: Connection closed by 147.75.109.163 port 33162
Dec 13 13:44:01.370926 sshd-session[8053]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:01.376733 systemd[1]: sshd@72-188.245.225.138:22-147.75.109.163:33162.service: Deactivated successfully.
Dec 13 13:44:01.380947 systemd[1]: session-71.scope: Deactivated successfully.
Dec 13 13:44:01.386402 systemd-logind[1464]: Session 71 logged out. Waiting for processes to exit.
Dec 13 13:44:01.390118 systemd-logind[1464]: Removed session 71.
Dec 13 13:44:06.547016 systemd[1]: Started sshd@73-188.245.225.138:22-147.75.109.163:59586.service - OpenSSH per-connection server daemon (147.75.109.163:59586).
Dec 13 13:44:07.532263 sshd[8109]: Accepted publickey for core from 147.75.109.163 port 59586 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:07.534183 sshd-session[8109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:07.542596 systemd-logind[1464]: New session 72 of user core.
Dec 13 13:44:07.548743 systemd[1]: Started session-72.scope - Session 72 of User core.
Dec 13 13:44:08.312752 sshd[8111]: Connection closed by 147.75.109.163 port 59586
Dec 13 13:44:08.313454 sshd-session[8109]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:08.317887 systemd[1]: sshd@73-188.245.225.138:22-147.75.109.163:59586.service: Deactivated successfully.
Dec 13 13:44:08.322006 systemd[1]: session-72.scope: Deactivated successfully.
Dec 13 13:44:08.327055 systemd-logind[1464]: Session 72 logged out. Waiting for processes to exit.
Dec 13 13:44:08.329039 systemd-logind[1464]: Removed session 72.
Dec 13 13:44:13.503782 systemd[1]: Started sshd@74-188.245.225.138:22-147.75.109.163:59594.service - OpenSSH per-connection server daemon (147.75.109.163:59594).
Dec 13 13:44:14.498487 sshd[8142]: Accepted publickey for core from 147.75.109.163 port 59594 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:14.501364 sshd-session[8142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:14.508395 systemd-logind[1464]: New session 73 of user core.
Dec 13 13:44:14.513966 systemd[1]: Started session-73.scope - Session 73 of User core.
Dec 13 13:44:15.280324 sshd[8144]: Connection closed by 147.75.109.163 port 59594
Dec 13 13:44:15.280818 sshd-session[8142]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:15.285778 systemd[1]: sshd@74-188.245.225.138:22-147.75.109.163:59594.service: Deactivated successfully.
Dec 13 13:44:15.289595 systemd[1]: session-73.scope: Deactivated successfully.
Dec 13 13:44:15.292043 systemd-logind[1464]: Session 73 logged out. Waiting for processes to exit.
Dec 13 13:44:15.293920 systemd-logind[1464]: Removed session 73.
Dec 13 13:44:20.455034 systemd[1]: Started sshd@75-188.245.225.138:22-147.75.109.163:36920.service - OpenSSH per-connection server daemon (147.75.109.163:36920).
Dec 13 13:44:20.459117 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
Dec 13 13:44:20.500963 systemd-tmpfiles[8158]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 13 13:44:20.501929 systemd-tmpfiles[8158]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 13 13:44:20.504740 systemd-tmpfiles[8158]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 13 13:44:20.507117 systemd-tmpfiles[8158]: ACLs are not supported, ignoring.
Dec 13 13:44:20.507201 systemd-tmpfiles[8158]: ACLs are not supported, ignoring.
Dec 13 13:44:20.513011 systemd-tmpfiles[8158]: Detected autofs mount point /boot during canonicalization of boot.
Dec 13 13:44:20.513202 systemd-tmpfiles[8158]: Skipping /boot
Dec 13 13:44:20.525167 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 13 13:44:20.526157 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
Dec 13 13:44:21.444751 sshd[8157]: Accepted publickey for core from 147.75.109.163 port 36920 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:21.446396 sshd-session[8157]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:21.452089 systemd-logind[1464]: New session 74 of user core.
Dec 13 13:44:21.457791 systemd[1]: Started session-74.scope - Session 74 of User core.
Dec 13 13:44:22.216072 sshd[8162]: Connection closed by 147.75.109.163 port 36920
Dec 13 13:44:22.216782 sshd-session[8157]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:22.224417 systemd[1]: sshd@75-188.245.225.138:22-147.75.109.163:36920.service: Deactivated successfully.
Dec 13 13:44:22.227396 systemd[1]: session-74.scope: Deactivated successfully.
Dec 13 13:44:22.229149 systemd-logind[1464]: Session 74 logged out. Waiting for processes to exit.
Dec 13 13:44:22.230324 systemd-logind[1464]: Removed session 74.
Dec 13 13:44:27.398394 systemd[1]: Started sshd@76-188.245.225.138:22-147.75.109.163:38766.service - OpenSSH per-connection server daemon (147.75.109.163:38766).
Dec 13 13:44:28.392996 sshd[8173]: Accepted publickey for core from 147.75.109.163 port 38766 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:28.395584 sshd-session[8173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:28.403429 systemd-logind[1464]: New session 75 of user core.
Dec 13 13:44:28.416044 systemd[1]: Started session-75.scope - Session 75 of User core.
Dec 13 13:44:29.157361 sshd[8175]: Connection closed by 147.75.109.163 port 38766
Dec 13 13:44:29.160405 sshd-session[8173]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:29.167760 systemd[1]: session-75.scope: Deactivated successfully.
Dec 13 13:44:29.170474 systemd-logind[1464]: Session 75 logged out. Waiting for processes to exit.
Dec 13 13:44:29.172688 systemd[1]: sshd@76-188.245.225.138:22-147.75.109.163:38766.service: Deactivated successfully.
Dec 13 13:44:29.177843 systemd-logind[1464]: Removed session 75.
Dec 13 13:44:33.799252 systemd[1]: run-containerd-runc-k8s.io-7dec6487ded13cf5c9ca9b862bebb34cfff96100eca49a624ad2ed7d54dc1ebe-runc.E2Ix6W.mount: Deactivated successfully.
Dec 13 13:44:34.331799 systemd[1]: Started sshd@77-188.245.225.138:22-147.75.109.163:38774.service - OpenSSH per-connection server daemon (147.75.109.163:38774).
Dec 13 13:44:35.311178 sshd[8227]: Accepted publickey for core from 147.75.109.163 port 38774 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:35.313275 sshd-session[8227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:35.322728 systemd-logind[1464]: New session 76 of user core.
Dec 13 13:44:35.327836 systemd[1]: Started session-76.scope - Session 76 of User core.
Dec 13 13:44:36.076741 sshd[8229]: Connection closed by 147.75.109.163 port 38774
Dec 13 13:44:36.077962 sshd-session[8227]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:36.083210 systemd[1]: sshd@77-188.245.225.138:22-147.75.109.163:38774.service: Deactivated successfully.
Dec 13 13:44:36.086534 systemd[1]: session-76.scope: Deactivated successfully.
Dec 13 13:44:36.088920 systemd-logind[1464]: Session 76 logged out. Waiting for processes to exit.
Dec 13 13:44:36.090153 systemd-logind[1464]: Removed session 76.
Dec 13 13:44:41.266892 systemd[1]: Started sshd@78-188.245.225.138:22-147.75.109.163:46156.service - OpenSSH per-connection server daemon (147.75.109.163:46156).
Dec 13 13:44:42.253427 sshd[8240]: Accepted publickey for core from 147.75.109.163 port 46156 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:42.259013 sshd-session[8240]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:42.274573 systemd-logind[1464]: New session 77 of user core.
Dec 13 13:44:42.288196 systemd[1]: Started session-77.scope - Session 77 of User core.
Dec 13 13:44:43.012526 sshd[8242]: Connection closed by 147.75.109.163 port 46156
Dec 13 13:44:43.011600 sshd-session[8240]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:43.016534 systemd[1]: sshd@78-188.245.225.138:22-147.75.109.163:46156.service: Deactivated successfully.
Dec 13 13:44:43.019369 systemd[1]: session-77.scope: Deactivated successfully.
Dec 13 13:44:43.022745 systemd-logind[1464]: Session 77 logged out. Waiting for processes to exit.
Dec 13 13:44:43.023922 systemd-logind[1464]: Removed session 77.
Dec 13 13:44:48.188902 systemd[1]: Started sshd@79-188.245.225.138:22-147.75.109.163:51942.service - OpenSSH per-connection server daemon (147.75.109.163:51942).
Dec 13 13:44:49.188212 sshd[8255]: Accepted publickey for core from 147.75.109.163 port 51942 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:49.189884 sshd-session[8255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:49.197666 systemd-logind[1464]: New session 78 of user core.
Dec 13 13:44:49.204899 systemd[1]: Started session-78.scope - Session 78 of User core.
Dec 13 13:44:49.983014 sshd[8257]: Connection closed by 147.75.109.163 port 51942
Dec 13 13:44:49.983804 sshd-session[8255]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:49.987648 systemd[1]: sshd@79-188.245.225.138:22-147.75.109.163:51942.service: Deactivated successfully.
Dec 13 13:44:49.990971 systemd[1]: session-78.scope: Deactivated successfully.
Dec 13 13:44:49.995869 systemd-logind[1464]: Session 78 logged out. Waiting for processes to exit.
Dec 13 13:44:49.999841 systemd-logind[1464]: Removed session 78.
Dec 13 13:44:55.164424 systemd[1]: Started sshd@80-188.245.225.138:22-147.75.109.163:51950.service - OpenSSH per-connection server daemon (147.75.109.163:51950).
Dec 13 13:44:56.161561 sshd[8268]: Accepted publickey for core from 147.75.109.163 port 51950 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:44:56.163699 sshd-session[8268]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:44:56.169366 systemd-logind[1464]: New session 79 of user core.
Dec 13 13:44:56.174751 systemd[1]: Started session-79.scope - Session 79 of User core.
Dec 13 13:44:56.944583 sshd[8270]: Connection closed by 147.75.109.163 port 51950
Dec 13 13:44:56.945263 sshd-session[8268]: pam_unix(sshd:session): session closed for user core
Dec 13 13:44:56.949819 systemd[1]: sshd@80-188.245.225.138:22-147.75.109.163:51950.service: Deactivated successfully.
Dec 13 13:44:56.952122 systemd[1]: session-79.scope: Deactivated successfully.
Dec 13 13:44:56.953856 systemd-logind[1464]: Session 79 logged out. Waiting for processes to exit.
Dec 13 13:44:56.955359 systemd-logind[1464]: Removed session 79.
Dec 13 13:45:02.122031 systemd[1]: Started sshd@81-188.245.225.138:22-147.75.109.163:59520.service - OpenSSH per-connection server daemon (147.75.109.163:59520).
Dec 13 13:45:03.125993 sshd[8306]: Accepted publickey for core from 147.75.109.163 port 59520 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:03.128872 sshd-session[8306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:03.134765 systemd-logind[1464]: New session 80 of user core.
Dec 13 13:45:03.138795 systemd[1]: Started session-80.scope - Session 80 of User core.
Dec 13 13:45:03.894716 sshd[8308]: Connection closed by 147.75.109.163 port 59520
Dec 13 13:45:03.894594 sshd-session[8306]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:03.899216 systemd[1]: sshd@81-188.245.225.138:22-147.75.109.163:59520.service: Deactivated successfully.
Dec 13 13:45:03.903933 systemd[1]: session-80.scope: Deactivated successfully.
Dec 13 13:45:03.906231 systemd-logind[1464]: Session 80 logged out. Waiting for processes to exit.
Dec 13 13:45:03.907642 systemd-logind[1464]: Removed session 80.
Dec 13 13:45:04.069518 systemd[1]: Started sshd@82-188.245.225.138:22-147.75.109.163:59530.service - OpenSSH per-connection server daemon (147.75.109.163:59530).
Dec 13 13:45:05.054539 sshd[8344]: Accepted publickey for core from 147.75.109.163 port 59530 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:05.053927 sshd-session[8344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:05.065928 systemd-logind[1464]: New session 81 of user core.
Dec 13 13:45:05.073108 systemd[1]: Started session-81.scope - Session 81 of User core.
Dec 13 13:45:05.965397 sshd[8346]: Connection closed by 147.75.109.163 port 59530
Dec 13 13:45:05.966736 sshd-session[8344]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:05.972007 systemd-logind[1464]: Session 81 logged out. Waiting for processes to exit.
Dec 13 13:45:05.972806 systemd[1]: sshd@82-188.245.225.138:22-147.75.109.163:59530.service: Deactivated successfully.
Dec 13 13:45:05.976040 systemd[1]: session-81.scope: Deactivated successfully.
Dec 13 13:45:05.977916 systemd-logind[1464]: Removed session 81.
Dec 13 13:45:06.146306 systemd[1]: Started sshd@83-188.245.225.138:22-147.75.109.163:59536.service - OpenSSH per-connection server daemon (147.75.109.163:59536).
Dec 13 13:45:07.127176 sshd[8357]: Accepted publickey for core from 147.75.109.163 port 59536 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:07.129711 sshd-session[8357]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:07.135347 systemd-logind[1464]: New session 82 of user core.
Dec 13 13:45:07.147173 systemd[1]: Started session-82.scope - Session 82 of User core.
Dec 13 13:45:09.778316 sshd[8359]: Connection closed by 147.75.109.163 port 59536
Dec 13 13:45:09.780844 sshd-session[8357]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:09.784953 systemd[1]: sshd@83-188.245.225.138:22-147.75.109.163:59536.service: Deactivated successfully.
Dec 13 13:45:09.788117 systemd[1]: session-82.scope: Deactivated successfully.
Dec 13 13:45:09.791368 systemd-logind[1464]: Session 82 logged out. Waiting for processes to exit.
Dec 13 13:45:09.793382 systemd-logind[1464]: Removed session 82.
Dec 13 13:45:09.955959 systemd[1]: Started sshd@84-188.245.225.138:22-147.75.109.163:38422.service - OpenSSH per-connection server daemon (147.75.109.163:38422).
Dec 13 13:45:10.964911 sshd[8395]: Accepted publickey for core from 147.75.109.163 port 38422 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:10.966964 sshd-session[8395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:10.972748 systemd-logind[1464]: New session 83 of user core.
Dec 13 13:45:10.980836 systemd[1]: Started session-83.scope - Session 83 of User core.
Dec 13 13:45:11.875478 sshd[8397]: Connection closed by 147.75.109.163 port 38422
Dec 13 13:45:11.875123 sshd-session[8395]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:11.880810 systemd-logind[1464]: Session 83 logged out. Waiting for processes to exit.
Dec 13 13:45:11.882128 systemd[1]: sshd@84-188.245.225.138:22-147.75.109.163:38422.service: Deactivated successfully.
Dec 13 13:45:11.885269 systemd[1]: session-83.scope: Deactivated successfully.
Dec 13 13:45:11.887449 systemd-logind[1464]: Removed session 83.
Dec 13 13:45:12.054208 systemd[1]: Started sshd@85-188.245.225.138:22-147.75.109.163:38436.service - OpenSSH per-connection server daemon (147.75.109.163:38436).
Dec 13 13:45:13.039611 sshd[8406]: Accepted publickey for core from 147.75.109.163 port 38436 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:13.041891 sshd-session[8406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:13.051172 systemd-logind[1464]: New session 84 of user core.
Dec 13 13:45:13.058805 systemd[1]: Started session-84.scope - Session 84 of User core.
Dec 13 13:45:13.803626 sshd[8408]: Connection closed by 147.75.109.163 port 38436
Dec 13 13:45:13.804921 sshd-session[8406]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:13.812446 systemd[1]: sshd@85-188.245.225.138:22-147.75.109.163:38436.service: Deactivated successfully.
Dec 13 13:45:13.812999 systemd-logind[1464]: Session 84 logged out. Waiting for processes to exit.
Dec 13 13:45:13.818664 systemd[1]: session-84.scope: Deactivated successfully.
Dec 13 13:45:13.820142 systemd-logind[1464]: Removed session 84.
Dec 13 13:45:18.986112 systemd[1]: Started sshd@86-188.245.225.138:22-147.75.109.163:41094.service - OpenSSH per-connection server daemon (147.75.109.163:41094).
Dec 13 13:45:19.973777 sshd[8433]: Accepted publickey for core from 147.75.109.163 port 41094 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:19.976561 sshd-session[8433]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:19.983993 systemd-logind[1464]: New session 85 of user core.
Dec 13 13:45:19.993925 systemd[1]: Started session-85.scope - Session 85 of User core.
Dec 13 13:45:20.742079 sshd[8435]: Connection closed by 147.75.109.163 port 41094
Dec 13 13:45:20.742952 sshd-session[8433]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:20.748457 systemd[1]: sshd@86-188.245.225.138:22-147.75.109.163:41094.service: Deactivated successfully.
Dec 13 13:45:20.753068 systemd[1]: session-85.scope: Deactivated successfully.
Dec 13 13:45:20.755798 systemd-logind[1464]: Session 85 logged out. Waiting for processes to exit.
Dec 13 13:45:20.758231 systemd-logind[1464]: Removed session 85.
Dec 13 13:45:25.915279 systemd[1]: Started sshd@87-188.245.225.138:22-147.75.109.163:41102.service - OpenSSH per-connection server daemon (147.75.109.163:41102).
Dec 13 13:45:26.903192 sshd[8451]: Accepted publickey for core from 147.75.109.163 port 41102 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:26.904967 sshd-session[8451]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:26.912373 systemd-logind[1464]: New session 86 of user core.
Dec 13 13:45:26.916101 systemd[1]: Started session-86.scope - Session 86 of User core.
Dec 13 13:45:27.664648 sshd[8453]: Connection closed by 147.75.109.163 port 41102
Dec 13 13:45:27.665418 sshd-session[8451]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:27.671973 systemd[1]: sshd@87-188.245.225.138:22-147.75.109.163:41102.service: Deactivated successfully.
Dec 13 13:45:27.675076 systemd[1]: session-86.scope: Deactivated successfully.
Dec 13 13:45:27.676295 systemd-logind[1464]: Session 86 logged out. Waiting for processes to exit.
Dec 13 13:45:27.678933 systemd-logind[1464]: Removed session 86.
Dec 13 13:45:32.839900 systemd[1]: Started sshd@88-188.245.225.138:22-147.75.109.163:52488.service - OpenSSH per-connection server daemon (147.75.109.163:52488).
Dec 13 13:45:33.821559 sshd[8487]: Accepted publickey for core from 147.75.109.163 port 52488 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:33.824446 sshd-session[8487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:33.832331 systemd-logind[1464]: New session 87 of user core.
Dec 13 13:45:33.837750 systemd[1]: Started session-87.scope - Session 87 of User core.
Dec 13 13:45:34.586574 sshd[8508]: Connection closed by 147.75.109.163 port 52488
Dec 13 13:45:34.587433 sshd-session[8487]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:34.594848 systemd[1]: sshd@88-188.245.225.138:22-147.75.109.163:52488.service: Deactivated successfully.
Dec 13 13:45:34.597956 systemd[1]: session-87.scope: Deactivated successfully.
Dec 13 13:45:34.601370 systemd-logind[1464]: Session 87 logged out. Waiting for processes to exit.
Dec 13 13:45:34.605313 systemd-logind[1464]: Removed session 87.
Dec 13 13:45:39.767957 systemd[1]: Started sshd@89-188.245.225.138:22-147.75.109.163:59818.service - OpenSSH per-connection server daemon (147.75.109.163:59818).
Dec 13 13:45:40.758119 sshd[8519]: Accepted publickey for core from 147.75.109.163 port 59818 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:40.759152 sshd-session[8519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:40.765449 systemd-logind[1464]: New session 88 of user core.
Dec 13 13:45:40.776235 systemd[1]: Started session-88.scope - Session 88 of User core.
Dec 13 13:45:41.520881 sshd[8521]: Connection closed by 147.75.109.163 port 59818
Dec 13 13:45:41.520654 sshd-session[8519]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:41.524979 systemd-logind[1464]: Session 88 logged out. Waiting for processes to exit.
Dec 13 13:45:41.527234 systemd[1]: sshd@89-188.245.225.138:22-147.75.109.163:59818.service: Deactivated successfully.
Dec 13 13:45:41.530384 systemd[1]: session-88.scope: Deactivated successfully.
Dec 13 13:45:41.533573 systemd-logind[1464]: Removed session 88.
Dec 13 13:45:46.703907 systemd[1]: Started sshd@90-188.245.225.138:22-147.75.109.163:49316.service - OpenSSH per-connection server daemon (147.75.109.163:49316).
Dec 13 13:45:47.707663 sshd[8534]: Accepted publickey for core from 147.75.109.163 port 49316 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:47.710469 sshd-session[8534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:47.720770 systemd-logind[1464]: New session 89 of user core.
Dec 13 13:45:47.727794 systemd[1]: Started session-89.scope - Session 89 of User core.
Dec 13 13:45:48.482163 sshd[8536]: Connection closed by 147.75.109.163 port 49316
Dec 13 13:45:48.482964 sshd-session[8534]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:48.489101 systemd[1]: sshd@90-188.245.225.138:22-147.75.109.163:49316.service: Deactivated successfully.
Dec 13 13:45:48.492619 systemd[1]: session-89.scope: Deactivated successfully.
Dec 13 13:45:48.493900 systemd-logind[1464]: Session 89 logged out. Waiting for processes to exit.
Dec 13 13:45:48.496330 systemd-logind[1464]: Removed session 89.
Dec 13 13:45:53.666651 systemd[1]: Started sshd@91-188.245.225.138:22-147.75.109.163:49318.service - OpenSSH per-connection server daemon (147.75.109.163:49318).
Dec 13 13:45:54.656693 sshd[8548]: Accepted publickey for core from 147.75.109.163 port 49318 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:45:54.659860 sshd-session[8548]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:45:54.674033 systemd-logind[1464]: New session 90 of user core.
Dec 13 13:45:54.678421 systemd[1]: Started session-90.scope - Session 90 of User core.
Dec 13 13:45:55.438554 sshd[8550]: Connection closed by 147.75.109.163 port 49318
Dec 13 13:45:55.439609 sshd-session[8548]: pam_unix(sshd:session): session closed for user core
Dec 13 13:45:55.445714 systemd[1]: sshd@91-188.245.225.138:22-147.75.109.163:49318.service: Deactivated successfully.
Dec 13 13:45:55.449433 systemd[1]: session-90.scope: Deactivated successfully.
Dec 13 13:45:55.450676 systemd-logind[1464]: Session 90 logged out. Waiting for processes to exit.
Dec 13 13:45:55.451890 systemd-logind[1464]: Removed session 90.
Dec 13 13:46:00.621000 systemd[1]: Started sshd@92-188.245.225.138:22-147.75.109.163:39884.service - OpenSSH per-connection server daemon (147.75.109.163:39884).
Dec 13 13:46:01.617521 sshd[8563]: Accepted publickey for core from 147.75.109.163 port 39884 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:01.620399 sshd-session[8563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:01.626779 systemd-logind[1464]: New session 91 of user core.
Dec 13 13:46:01.632899 systemd[1]: Started session-91.scope - Session 91 of User core.
Dec 13 13:46:02.405932 sshd[8586]: Connection closed by 147.75.109.163 port 39884
Dec 13 13:46:02.406604 sshd-session[8563]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:02.411281 systemd[1]: sshd@92-188.245.225.138:22-147.75.109.163:39884.service: Deactivated successfully.
Dec 13 13:46:02.416260 systemd[1]: session-91.scope: Deactivated successfully.
Dec 13 13:46:02.418318 systemd-logind[1464]: Session 91 logged out. Waiting for processes to exit.
Dec 13 13:46:02.419985 systemd-logind[1464]: Removed session 91.
Dec 13 13:46:07.575061 systemd[1]: Started sshd@93-188.245.225.138:22-147.75.109.163:60610.service - OpenSSH per-connection server daemon (147.75.109.163:60610).
Dec 13 13:46:08.554663 sshd[8616]: Accepted publickey for core from 147.75.109.163 port 60610 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:08.556669 sshd-session[8616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:08.563256 systemd-logind[1464]: New session 92 of user core.
Dec 13 13:46:08.567850 systemd[1]: Started session-92.scope - Session 92 of User core.
Dec 13 13:46:09.334626 sshd[8618]: Connection closed by 147.75.109.163 port 60610
Dec 13 13:46:09.335353 sshd-session[8616]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:09.340280 systemd[1]: sshd@93-188.245.225.138:22-147.75.109.163:60610.service: Deactivated successfully.
Dec 13 13:46:09.344133 systemd[1]: session-92.scope: Deactivated successfully.
Dec 13 13:46:09.346227 systemd-logind[1464]: Session 92 logged out. Waiting for processes to exit.
Dec 13 13:46:09.347466 systemd-logind[1464]: Removed session 92.
Dec 13 13:46:14.521045 systemd[1]: Started sshd@94-188.245.225.138:22-147.75.109.163:60626.service - OpenSSH per-connection server daemon (147.75.109.163:60626).
Dec 13 13:46:15.518385 sshd[8649]: Accepted publickey for core from 147.75.109.163 port 60626 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:15.519589 sshd-session[8649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:15.526408 systemd-logind[1464]: New session 93 of user core.
Dec 13 13:46:15.530762 systemd[1]: Started session-93.scope - Session 93 of User core.
Dec 13 13:46:16.280927 sshd[8651]: Connection closed by 147.75.109.163 port 60626
Dec 13 13:46:16.282046 sshd-session[8649]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:16.286576 systemd[1]: sshd@94-188.245.225.138:22-147.75.109.163:60626.service: Deactivated successfully.
Dec 13 13:46:16.290130 systemd[1]: session-93.scope: Deactivated successfully.
Dec 13 13:46:16.294102 systemd-logind[1464]: Session 93 logged out. Waiting for processes to exit.
Dec 13 13:46:16.296697 systemd-logind[1464]: Removed session 93.
Dec 13 13:46:21.460827 systemd[1]: Started sshd@95-188.245.225.138:22-147.75.109.163:53714.service - OpenSSH per-connection server daemon (147.75.109.163:53714).
Dec 13 13:46:22.445856 sshd[8664]: Accepted publickey for core from 147.75.109.163 port 53714 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:22.449315 sshd-session[8664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:22.461098 systemd-logind[1464]: New session 94 of user core.
Dec 13 13:46:22.467183 systemd[1]: Started session-94.scope - Session 94 of User core.
Dec 13 13:46:23.207967 sshd[8666]: Connection closed by 147.75.109.163 port 53714
Dec 13 13:46:23.208904 sshd-session[8664]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:23.213887 systemd[1]: sshd@95-188.245.225.138:22-147.75.109.163:53714.service: Deactivated successfully.
Dec 13 13:46:23.216956 systemd[1]: session-94.scope: Deactivated successfully.
Dec 13 13:46:23.221022 systemd-logind[1464]: Session 94 logged out. Waiting for processes to exit.
Dec 13 13:46:23.222095 systemd-logind[1464]: Removed session 94.
Dec 13 13:46:28.388934 systemd[1]: Started sshd@96-188.245.225.138:22-147.75.109.163:37452.service - OpenSSH per-connection server daemon (147.75.109.163:37452).
Dec 13 13:46:29.379549 sshd[8685]: Accepted publickey for core from 147.75.109.163 port 37452 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:29.383487 sshd-session[8685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:29.391385 systemd-logind[1464]: New session 95 of user core.
Dec 13 13:46:29.397931 systemd[1]: Started session-95.scope - Session 95 of User core.
Dec 13 13:46:30.145931 sshd[8687]: Connection closed by 147.75.109.163 port 37452
Dec 13 13:46:30.147248 sshd-session[8685]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:30.153181 systemd[1]: sshd@96-188.245.225.138:22-147.75.109.163:37452.service: Deactivated successfully.
Dec 13 13:46:30.158054 systemd[1]: session-95.scope: Deactivated successfully.
Dec 13 13:46:30.159855 systemd-logind[1464]: Session 95 logged out. Waiting for processes to exit.
Dec 13 13:46:30.162521 systemd-logind[1464]: Removed session 95.
Dec 13 13:46:35.328016 systemd[1]: Started sshd@97-188.245.225.138:22-147.75.109.163:37468.service - OpenSSH per-connection server daemon (147.75.109.163:37468).
Dec 13 13:46:36.331351 sshd[8739]: Accepted publickey for core from 147.75.109.163 port 37468 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:36.334871 sshd-session[8739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:36.342763 systemd-logind[1464]: New session 96 of user core.
Dec 13 13:46:36.352275 systemd[1]: Started session-96.scope - Session 96 of User core.
Dec 13 13:46:37.121997 sshd[8741]: Connection closed by 147.75.109.163 port 37468
Dec 13 13:46:37.122586 sshd-session[8739]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:37.126698 systemd[1]: sshd@97-188.245.225.138:22-147.75.109.163:37468.service: Deactivated successfully.
Dec 13 13:46:37.130095 systemd[1]: session-96.scope: Deactivated successfully.
Dec 13 13:46:37.132076 systemd-logind[1464]: Session 96 logged out. Waiting for processes to exit.
Dec 13 13:46:37.133835 systemd-logind[1464]: Removed session 96.
Dec 13 13:46:42.299178 systemd[1]: Started sshd@98-188.245.225.138:22-147.75.109.163:42448.service - OpenSSH per-connection server daemon (147.75.109.163:42448).
Dec 13 13:46:43.305548 sshd[8752]: Accepted publickey for core from 147.75.109.163 port 42448 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:43.307308 sshd-session[8752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:43.315264 systemd-logind[1464]: New session 97 of user core.
Dec 13 13:46:43.323819 systemd[1]: Started session-97.scope - Session 97 of User core.
Dec 13 13:46:44.080692 sshd[8754]: Connection closed by 147.75.109.163 port 42448
Dec 13 13:46:44.081477 sshd-session[8752]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:44.087795 systemd[1]: sshd@98-188.245.225.138:22-147.75.109.163:42448.service: Deactivated successfully.
Dec 13 13:46:44.090761 systemd[1]: session-97.scope: Deactivated successfully.
Dec 13 13:46:44.092006 systemd-logind[1464]: Session 97 logged out. Waiting for processes to exit.
Dec 13 13:46:44.093946 systemd-logind[1464]: Removed session 97.
Dec 13 13:46:49.256296 systemd[1]: Started sshd@99-188.245.225.138:22-147.75.109.163:53588.service - OpenSSH per-connection server daemon (147.75.109.163:53588).
Dec 13 13:46:50.232457 sshd[8766]: Accepted publickey for core from 147.75.109.163 port 53588 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:50.236358 sshd-session[8766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:50.246383 systemd-logind[1464]: New session 98 of user core.
Dec 13 13:46:50.252901 systemd[1]: Started session-98.scope - Session 98 of User core.
Dec 13 13:46:50.993167 sshd[8769]: Connection closed by 147.75.109.163 port 53588
Dec 13 13:46:50.992757 sshd-session[8766]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:51.001487 systemd-logind[1464]: Session 98 logged out. Waiting for processes to exit.
Dec 13 13:46:51.003939 systemd[1]: sshd@99-188.245.225.138:22-147.75.109.163:53588.service: Deactivated successfully.
Dec 13 13:46:51.007453 systemd[1]: session-98.scope: Deactivated successfully.
Dec 13 13:46:51.011161 systemd-logind[1464]: Removed session 98.
Dec 13 13:46:55.316830 systemd[1]: Started sshd@100-188.245.225.138:22-159.223.178.117:60288.service - OpenSSH per-connection server daemon (159.223.178.117:60288).
Dec 13 13:46:55.424850 sshd[8794]: Connection closed by 159.223.178.117 port 60288
Dec 13 13:46:55.427315 systemd[1]: sshd@100-188.245.225.138:22-159.223.178.117:60288.service: Deactivated successfully.
Dec 13 13:46:56.171573 systemd[1]: Started sshd@101-188.245.225.138:22-147.75.109.163:53598.service - OpenSSH per-connection server daemon (147.75.109.163:53598).
Dec 13 13:46:57.159665 sshd[8798]: Accepted publickey for core from 147.75.109.163 port 53598 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:46:57.162223 sshd-session[8798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:46:57.171022 systemd-logind[1464]: New session 99 of user core.
Dec 13 13:46:57.176228 systemd[1]: Started session-99.scope - Session 99 of User core.
Dec 13 13:46:57.937409 sshd[8800]: Connection closed by 147.75.109.163 port 53598
Dec 13 13:46:57.938652 sshd-session[8798]: pam_unix(sshd:session): session closed for user core
Dec 13 13:46:57.943754 systemd-logind[1464]: Session 99 logged out. Waiting for processes to exit.
Dec 13 13:46:57.944222 systemd[1]: sshd@101-188.245.225.138:22-147.75.109.163:53598.service: Deactivated successfully.
Dec 13 13:46:57.947931 systemd[1]: session-99.scope: Deactivated successfully.
Dec 13 13:46:57.953420 systemd-logind[1464]: Removed session 99.
Dec 13 13:47:03.112931 systemd[1]: Started sshd@102-188.245.225.138:22-147.75.109.163:58606.service - OpenSSH per-connection server daemon (147.75.109.163:58606).
Dec 13 13:47:04.100786 sshd[8838]: Accepted publickey for core from 147.75.109.163 port 58606 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:04.104258 sshd-session[8838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:04.112665 systemd-logind[1464]: New session 100 of user core.
Dec 13 13:47:04.120809 systemd[1]: Started session-100.scope - Session 100 of User core.
Dec 13 13:47:04.900569 sshd[8858]: Connection closed by 147.75.109.163 port 58606
Dec 13 13:47:04.901818 sshd-session[8838]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:04.906175 systemd[1]: sshd@102-188.245.225.138:22-147.75.109.163:58606.service: Deactivated successfully.
Dec 13 13:47:04.909935 systemd[1]: session-100.scope: Deactivated successfully.
Dec 13 13:47:04.913760 systemd-logind[1464]: Session 100 logged out. Waiting for processes to exit.
Dec 13 13:47:04.915707 systemd-logind[1464]: Removed session 100.
Dec 13 13:47:10.075002 systemd[1]: Started sshd@103-188.245.225.138:22-147.75.109.163:56308.service - OpenSSH per-connection server daemon (147.75.109.163:56308).
Dec 13 13:47:11.062742 sshd[8887]: Accepted publickey for core from 147.75.109.163 port 56308 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:11.065972 sshd-session[8887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:11.074112 systemd-logind[1464]: New session 101 of user core.
Dec 13 13:47:11.077728 systemd[1]: Started session-101.scope - Session 101 of User core.
Dec 13 13:47:11.829248 sshd[8889]: Connection closed by 147.75.109.163 port 56308
Dec 13 13:47:11.830931 sshd-session[8887]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:11.838577 systemd[1]: sshd@103-188.245.225.138:22-147.75.109.163:56308.service: Deactivated successfully.
Dec 13 13:47:11.842113 systemd[1]: session-101.scope: Deactivated successfully.
Dec 13 13:47:11.844977 systemd-logind[1464]: Session 101 logged out. Waiting for processes to exit.
Dec 13 13:47:11.846023 systemd-logind[1464]: Removed session 101.
Dec 13 13:47:17.011608 systemd[1]: Started sshd@104-188.245.225.138:22-147.75.109.163:41016.service - OpenSSH per-connection server daemon (147.75.109.163:41016).
Dec 13 13:47:18.009766 sshd[8902]: Accepted publickey for core from 147.75.109.163 port 41016 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:18.012721 sshd-session[8902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:18.019282 systemd-logind[1464]: New session 102 of user core.
Dec 13 13:47:18.027824 systemd[1]: Started session-102.scope - Session 102 of User core.
Dec 13 13:47:18.801584 sshd[8904]: Connection closed by 147.75.109.163 port 41016
Dec 13 13:47:18.802023 sshd-session[8902]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:18.808255 systemd-logind[1464]: Session 102 logged out. Waiting for processes to exit.
Dec 13 13:47:18.808970 systemd[1]: sshd@104-188.245.225.138:22-147.75.109.163:41016.service: Deactivated successfully.
Dec 13 13:47:18.812954 systemd[1]: session-102.scope: Deactivated successfully.
Dec 13 13:47:18.814475 systemd-logind[1464]: Removed session 102.
Dec 13 13:47:23.987124 systemd[1]: Started sshd@105-188.245.225.138:22-147.75.109.163:41022.service - OpenSSH per-connection server daemon (147.75.109.163:41022).
Dec 13 13:47:24.970475 sshd[8915]: Accepted publickey for core from 147.75.109.163 port 41022 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:24.973319 sshd-session[8915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:24.980954 systemd-logind[1464]: New session 103 of user core.
Dec 13 13:47:24.984739 systemd[1]: Started session-103.scope - Session 103 of User core.
Dec 13 13:47:25.740066 sshd[8917]: Connection closed by 147.75.109.163 port 41022
Dec 13 13:47:25.739650 sshd-session[8915]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:25.745141 systemd[1]: sshd@105-188.245.225.138:22-147.75.109.163:41022.service: Deactivated successfully.
Dec 13 13:47:25.748390 systemd[1]: session-103.scope: Deactivated successfully.
Dec 13 13:47:25.752839 systemd-logind[1464]: Session 103 logged out. Waiting for processes to exit.
Dec 13 13:47:25.755077 systemd-logind[1464]: Removed session 103.
Dec 13 13:47:30.916033 systemd[1]: Started sshd@106-188.245.225.138:22-147.75.109.163:56590.service - OpenSSH per-connection server daemon (147.75.109.163:56590).
Dec 13 13:47:31.902158 sshd[8949]: Accepted publickey for core from 147.75.109.163 port 56590 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:31.905061 sshd-session[8949]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:31.911603 systemd-logind[1464]: New session 104 of user core.
Dec 13 13:47:31.920862 systemd[1]: Started session-104.scope - Session 104 of User core.
Dec 13 13:47:32.654603 sshd[8952]: Connection closed by 147.75.109.163 port 56590
Dec 13 13:47:32.656354 sshd-session[8949]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:32.661202 systemd[1]: sshd@106-188.245.225.138:22-147.75.109.163:56590.service: Deactivated successfully.
Dec 13 13:47:32.664204 systemd[1]: session-104.scope: Deactivated successfully.
Dec 13 13:47:32.666349 systemd-logind[1464]: Session 104 logged out. Waiting for processes to exit.
Dec 13 13:47:32.667695 systemd-logind[1464]: Removed session 104.
Dec 13 13:47:37.832278 systemd[1]: Started sshd@107-188.245.225.138:22-147.75.109.163:39300.service - OpenSSH per-connection server daemon (147.75.109.163:39300).
Dec 13 13:47:38.826141 sshd[8983]: Accepted publickey for core from 147.75.109.163 port 39300 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:38.828576 sshd-session[8983]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:38.835772 systemd-logind[1464]: New session 105 of user core.
Dec 13 13:47:38.842903 systemd[1]: Started session-105.scope - Session 105 of User core.
Dec 13 13:47:39.581847 sshd[8985]: Connection closed by 147.75.109.163 port 39300
Dec 13 13:47:39.582680 sshd-session[8983]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:39.586133 systemd[1]: sshd@107-188.245.225.138:22-147.75.109.163:39300.service: Deactivated successfully.
Dec 13 13:47:39.591388 systemd[1]: session-105.scope: Deactivated successfully.
Dec 13 13:47:39.595588 systemd-logind[1464]: Session 105 logged out. Waiting for processes to exit.
Dec 13 13:47:39.597787 systemd-logind[1464]: Removed session 105.
Dec 13 13:47:44.754979 systemd[1]: Started sshd@108-188.245.225.138:22-147.75.109.163:39308.service - OpenSSH per-connection server daemon (147.75.109.163:39308).
Dec 13 13:47:45.755707 sshd[8996]: Accepted publickey for core from 147.75.109.163 port 39308 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:45.757897 sshd-session[8996]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:45.763308 systemd-logind[1464]: New session 106 of user core.
Dec 13 13:47:45.771734 systemd[1]: Started session-106.scope - Session 106 of User core.
Dec 13 13:47:46.517531 sshd[8998]: Connection closed by 147.75.109.163 port 39308
Dec 13 13:47:46.516807 sshd-session[8996]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:46.520972 systemd[1]: sshd@108-188.245.225.138:22-147.75.109.163:39308.service: Deactivated successfully.
Dec 13 13:47:46.524249 systemd[1]: session-106.scope: Deactivated successfully.
Dec 13 13:47:46.528883 systemd-logind[1464]: Session 106 logged out. Waiting for processes to exit.
Dec 13 13:47:46.532270 systemd-logind[1464]: Removed session 106.
Dec 13 13:47:51.699954 systemd[1]: Started sshd@109-188.245.225.138:22-147.75.109.163:35356.service - OpenSSH per-connection server daemon (147.75.109.163:35356).
Dec 13 13:47:52.689390 sshd[9011]: Accepted publickey for core from 147.75.109.163 port 35356 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:52.691341 sshd-session[9011]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:52.697419 systemd-logind[1464]: New session 107 of user core.
Dec 13 13:47:52.702750 systemd[1]: Started session-107.scope - Session 107 of User core.
Dec 13 13:47:53.456133 sshd[9013]: Connection closed by 147.75.109.163 port 35356
Dec 13 13:47:53.456858 sshd-session[9011]: pam_unix(sshd:session): session closed for user core
Dec 13 13:47:53.461253 systemd[1]: sshd@109-188.245.225.138:22-147.75.109.163:35356.service: Deactivated successfully.
Dec 13 13:47:53.464437 systemd[1]: session-107.scope: Deactivated successfully.
Dec 13 13:47:53.468661 systemd-logind[1464]: Session 107 logged out. Waiting for processes to exit.
Dec 13 13:47:53.470992 systemd-logind[1464]: Removed session 107.
Dec 13 13:47:58.634686 systemd[1]: Started sshd@110-188.245.225.138:22-147.75.109.163:43886.service - OpenSSH per-connection server daemon (147.75.109.163:43886).
Dec 13 13:47:59.626794 sshd[9024]: Accepted publickey for core from 147.75.109.163 port 43886 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:47:59.629374 sshd-session[9024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:47:59.635864 systemd-logind[1464]: New session 108 of user core.
Dec 13 13:47:59.641802 systemd[1]: Started session-108.scope - Session 108 of User core.
Dec 13 13:48:00.407510 sshd[9026]: Connection closed by 147.75.109.163 port 43886
Dec 13 13:48:00.408324 sshd-session[9024]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:00.414165 systemd[1]: sshd@110-188.245.225.138:22-147.75.109.163:43886.service: Deactivated successfully.
Dec 13 13:48:00.419528 systemd[1]: session-108.scope: Deactivated successfully.
Dec 13 13:48:00.422245 systemd-logind[1464]: Session 108 logged out. Waiting for processes to exit.
Dec 13 13:48:00.423535 systemd-logind[1464]: Removed session 108.
Dec 13 13:48:05.587874 systemd[1]: Started sshd@111-188.245.225.138:22-147.75.109.163:43896.service - OpenSSH per-connection server daemon (147.75.109.163:43896).
Dec 13 13:48:06.573962 sshd[9081]: Accepted publickey for core from 147.75.109.163 port 43896 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:06.576533 sshd-session[9081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:06.585370 systemd-logind[1464]: New session 109 of user core.
Dec 13 13:48:06.590893 systemd[1]: Started session-109.scope - Session 109 of User core.
Dec 13 13:48:07.338396 sshd[9083]: Connection closed by 147.75.109.163 port 43896
Dec 13 13:48:07.339366 sshd-session[9081]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:07.346040 systemd[1]: sshd@111-188.245.225.138:22-147.75.109.163:43896.service: Deactivated successfully.
Dec 13 13:48:07.350199 systemd[1]: session-109.scope: Deactivated successfully.
Dec 13 13:48:07.351279 systemd-logind[1464]: Session 109 logged out. Waiting for processes to exit.
Dec 13 13:48:07.352698 systemd-logind[1464]: Removed session 109.
Dec 13 13:48:12.518540 systemd[1]: Started sshd@112-188.245.225.138:22-147.75.109.163:42852.service - OpenSSH per-connection server daemon (147.75.109.163:42852).
Dec 13 13:48:13.501896 sshd[9114]: Accepted publickey for core from 147.75.109.163 port 42852 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:13.505674 sshd-session[9114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:13.512024 systemd-logind[1464]: New session 110 of user core.
Dec 13 13:48:13.517845 systemd[1]: Started session-110.scope - Session 110 of User core.
Dec 13 13:48:14.274738 sshd[9116]: Connection closed by 147.75.109.163 port 42852
Dec 13 13:48:14.276017 sshd-session[9114]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:14.282202 systemd[1]: sshd@112-188.245.225.138:22-147.75.109.163:42852.service: Deactivated successfully.
Dec 13 13:48:14.286273 systemd[1]: session-110.scope: Deactivated successfully.
Dec 13 13:48:14.289731 systemd-logind[1464]: Session 110 logged out. Waiting for processes to exit.
Dec 13 13:48:14.292398 systemd-logind[1464]: Removed session 110.
Dec 13 13:48:19.465261 systemd[1]: Started sshd@113-188.245.225.138:22-147.75.109.163:34446.service - OpenSSH per-connection server daemon (147.75.109.163:34446).
Dec 13 13:48:20.473093 sshd[9129]: Accepted publickey for core from 147.75.109.163 port 34446 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:20.475750 sshd-session[9129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:20.481071 systemd-logind[1464]: New session 111 of user core.
Dec 13 13:48:20.491869 systemd[1]: Started session-111.scope - Session 111 of User core.
Dec 13 13:48:21.253597 sshd[9131]: Connection closed by 147.75.109.163 port 34446
Dec 13 13:48:21.254653 sshd-session[9129]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:21.267666 systemd[1]: sshd@113-188.245.225.138:22-147.75.109.163:34446.service: Deactivated successfully.
Dec 13 13:48:21.272394 systemd[1]: session-111.scope: Deactivated successfully.
Dec 13 13:48:21.274167 systemd-logind[1464]: Session 111 logged out. Waiting for processes to exit.
Dec 13 13:48:21.276638 systemd-logind[1464]: Removed session 111.
Dec 13 13:48:26.423560 systemd[1]: Started sshd@114-188.245.225.138:22-147.75.109.163:36242.service - OpenSSH per-connection server daemon (147.75.109.163:36242).
Dec 13 13:48:27.421602 sshd[9141]: Accepted publickey for core from 147.75.109.163 port 36242 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:27.424214 sshd-session[9141]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:27.431663 systemd-logind[1464]: New session 112 of user core.
Dec 13 13:48:27.438780 systemd[1]: Started session-112.scope - Session 112 of User core.
Dec 13 13:48:28.192074 sshd[9143]: Connection closed by 147.75.109.163 port 36242
Dec 13 13:48:28.193031 sshd-session[9141]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:28.199158 systemd[1]: sshd@114-188.245.225.138:22-147.75.109.163:36242.service: Deactivated successfully.
Dec 13 13:48:28.203331 systemd[1]: session-112.scope: Deactivated successfully.
Dec 13 13:48:28.204811 systemd-logind[1464]: Session 112 logged out. Waiting for processes to exit.
Dec 13 13:48:28.206243 systemd-logind[1464]: Removed session 112.
Dec 13 13:48:33.373036 systemd[1]: Started sshd@115-188.245.225.138:22-147.75.109.163:36252.service - OpenSSH per-connection server daemon (147.75.109.163:36252).
Dec 13 13:48:34.353028 sshd[9193]: Accepted publickey for core from 147.75.109.163 port 36252 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:34.356755 sshd-session[9193]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:34.367327 systemd-logind[1464]: New session 113 of user core.
Dec 13 13:48:34.373057 systemd[1]: Started session-113.scope - Session 113 of User core.
Dec 13 13:48:35.122250 sshd[9216]: Connection closed by 147.75.109.163 port 36252
Dec 13 13:48:35.123130 sshd-session[9193]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:35.130034 systemd-logind[1464]: Session 113 logged out. Waiting for processes to exit.
Dec 13 13:48:35.131360 systemd[1]: sshd@115-188.245.225.138:22-147.75.109.163:36252.service: Deactivated successfully.
Dec 13 13:48:35.139658 systemd[1]: session-113.scope: Deactivated successfully.
Dec 13 13:48:35.143623 systemd-logind[1464]: Removed session 113.
Dec 13 13:48:40.305248 systemd[1]: Started sshd@116-188.245.225.138:22-147.75.109.163:40666.service - OpenSSH per-connection server daemon (147.75.109.163:40666).
Dec 13 13:48:41.290191 sshd[9227]: Accepted publickey for core from 147.75.109.163 port 40666 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:41.292364 sshd-session[9227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:41.300105 systemd-logind[1464]: New session 114 of user core.
Dec 13 13:48:41.309917 systemd[1]: Started session-114.scope - Session 114 of User core.
Dec 13 13:48:42.056607 sshd[9229]: Connection closed by 147.75.109.163 port 40666
Dec 13 13:48:42.057303 sshd-session[9227]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:42.061465 systemd[1]: sshd@116-188.245.225.138:22-147.75.109.163:40666.service: Deactivated successfully.
Dec 13 13:48:42.061558 systemd-logind[1464]: Session 114 logged out. Waiting for processes to exit.
Dec 13 13:48:42.064390 systemd[1]: session-114.scope: Deactivated successfully.
Dec 13 13:48:42.067624 systemd-logind[1464]: Removed session 114.
Dec 13 13:48:47.238892 systemd[1]: Started sshd@117-188.245.225.138:22-147.75.109.163:60914.service - OpenSSH per-connection server daemon (147.75.109.163:60914).
Dec 13 13:48:48.237196 sshd[9242]: Accepted publickey for core from 147.75.109.163 port 60914 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:48.239276 sshd-session[9242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:48.245691 systemd-logind[1464]: New session 115 of user core.
Dec 13 13:48:48.251807 systemd[1]: Started session-115.scope - Session 115 of User core.
Dec 13 13:48:49.012401 sshd[9244]: Connection closed by 147.75.109.163 port 60914
Dec 13 13:48:49.012578 sshd-session[9242]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:49.016827 systemd[1]: session-115.scope: Deactivated successfully.
Dec 13 13:48:49.017648 systemd[1]: sshd@117-188.245.225.138:22-147.75.109.163:60914.service: Deactivated successfully.
Dec 13 13:48:49.021764 systemd-logind[1464]: Session 115 logged out. Waiting for processes to exit.
Dec 13 13:48:49.023116 systemd-logind[1464]: Removed session 115.
Dec 13 13:48:54.199484 systemd[1]: Started sshd@118-188.245.225.138:22-147.75.109.163:60924.service - OpenSSH per-connection server daemon (147.75.109.163:60924).
Dec 13 13:48:55.199616 sshd[9255]: Accepted publickey for core from 147.75.109.163 port 60924 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:48:55.202786 sshd-session[9255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:48:55.210908 systemd-logind[1464]: New session 116 of user core.
Dec 13 13:48:55.218991 systemd[1]: Started session-116.scope - Session 116 of User core.
Dec 13 13:48:55.990160 sshd[9257]: Connection closed by 147.75.109.163 port 60924
Dec 13 13:48:55.990006 sshd-session[9255]: pam_unix(sshd:session): session closed for user core
Dec 13 13:48:56.002651 systemd-logind[1464]: Session 116 logged out. Waiting for processes to exit.
Dec 13 13:48:56.002953 systemd[1]: sshd@118-188.245.225.138:22-147.75.109.163:60924.service: Deactivated successfully.
Dec 13 13:48:56.007617 systemd[1]: session-116.scope: Deactivated successfully.
Dec 13 13:48:56.010997 systemd-logind[1464]: Removed session 116.
Dec 13 13:49:00.874774 systemd[1]: run-containerd-runc-k8s.io-df920426d450aefc540d6dda8e746b82b1bcd1b065538ff1394664803b384e8f-runc.Txi0Ol.mount: Deactivated successfully.
Dec 13 13:49:01.167491 systemd[1]: Started sshd@119-188.245.225.138:22-147.75.109.163:54814.service - OpenSSH per-connection server daemon (147.75.109.163:54814).
Dec 13 13:49:02.165480 sshd[9290]: Accepted publickey for core from 147.75.109.163 port 54814 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:49:02.166867 sshd-session[9290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:49:02.180646 systemd-logind[1464]: New session 117 of user core.
Dec 13 13:49:02.192957 systemd[1]: Started session-117.scope - Session 117 of User core.
Dec 13 13:49:02.937071 sshd[9292]: Connection closed by 147.75.109.163 port 54814
Dec 13 13:49:02.938525 sshd-session[9290]: pam_unix(sshd:session): session closed for user core
Dec 13 13:49:02.944310 systemd[1]: sshd@119-188.245.225.138:22-147.75.109.163:54814.service: Deactivated successfully.
Dec 13 13:49:02.948356 systemd[1]: session-117.scope: Deactivated successfully.
Dec 13 13:49:02.949427 systemd-logind[1464]: Session 117 logged out. Waiting for processes to exit.
Dec 13 13:49:02.950721 systemd-logind[1464]: Removed session 117.
Dec 13 13:49:08.118121 systemd[1]: Started sshd@120-188.245.225.138:22-147.75.109.163:60254.service - OpenSSH per-connection server daemon (147.75.109.163:60254).
Dec 13 13:49:09.126379 sshd[9323]: Accepted publickey for core from 147.75.109.163 port 60254 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:49:09.129367 sshd-session[9323]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:49:09.140569 systemd-logind[1464]: New session 118 of user core.
Dec 13 13:49:09.143803 systemd[1]: Started session-118.scope - Session 118 of User core.
Dec 13 13:49:09.907978 sshd[9325]: Connection closed by 147.75.109.163 port 60254
Dec 13 13:49:09.908732 sshd-session[9323]: pam_unix(sshd:session): session closed for user core
Dec 13 13:49:09.914078 systemd-logind[1464]: Session 118 logged out. Waiting for processes to exit.
Dec 13 13:49:09.914897 systemd[1]: sshd@120-188.245.225.138:22-147.75.109.163:60254.service: Deactivated successfully.
Dec 13 13:49:09.920240 systemd[1]: session-118.scope: Deactivated successfully.
Dec 13 13:49:09.923374 systemd-logind[1464]: Removed session 118.
Dec 13 13:49:15.095976 systemd[1]: Started sshd@121-188.245.225.138:22-147.75.109.163:60270.service - OpenSSH per-connection server daemon (147.75.109.163:60270).
Dec 13 13:49:16.102249 sshd[9354]: Accepted publickey for core from 147.75.109.163 port 60270 ssh2: RSA SHA256:mujnIWCHI6g7RhOIFUnU4L23td/nvZma7inzBKSUIRw
Dec 13 13:49:16.104753 sshd-session[9354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 13 13:49:16.113647 systemd-logind[1464]: New session 119 of user core.
Dec 13 13:49:16.125853 systemd[1]: Started session-119.scope - Session 119 of User core.
Dec 13 13:49:16.871522 sshd[9358]: Connection closed by 147.75.109.163 port 60270
Dec 13 13:49:16.870413 sshd-session[9354]: pam_unix(sshd:session): session closed for user core
Dec 13 13:49:16.874422 systemd-logind[1464]: Session 119 logged out. Waiting for processes to exit.
Dec 13 13:49:16.874802 systemd[1]: sshd@121-188.245.225.138:22-147.75.109.163:60270.service: Deactivated successfully.
Dec 13 13:49:16.877957 systemd[1]: session-119.scope: Deactivated successfully.
Dec 13 13:49:16.880713 systemd-logind[1464]: Removed session 119.
Dec 13 13:49:42.433802 systemd[1]: cri-containerd-c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b.scope: Deactivated successfully.
Dec 13 13:49:42.434436 systemd[1]: cri-containerd-c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b.scope: Consumed 14.954s CPU time, 22.3M memory peak, 0B memory swap peak.
Dec 13 13:49:42.464358 containerd[1480]: time="2024-12-13T13:49:42.464294907Z" level=info msg="shim disconnected" id=c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b namespace=k8s.io
Dec 13 13:49:42.464948 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b-rootfs.mount: Deactivated successfully.
Dec 13 13:49:42.465059 containerd[1480]: time="2024-12-13T13:49:42.464929916Z" level=warning msg="cleaning up after shim disconnected" id=c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b namespace=k8s.io
Dec 13 13:49:42.465847 containerd[1480]: time="2024-12-13T13:49:42.465654245Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Dec 13 13:49:42.620559 kubelet[2825]: I1213 13:49:42.620004 2825 scope.go:117] "RemoveContainer" containerID="c96285d0da7ad406c6eb8f09ad7db9ef7a35732f69f5d07d6a8c68e30991d69b"
Dec 13 13:49:42.623704 containerd[1480]: time="2024-12-13T13:49:42.623567506Z" level=info msg="CreateContainer within sandbox \"9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 13 13:49:42.636757 containerd[1480]: time="2024-12-13T13:49:42.636571236Z" level=info msg="CreateContainer within sandbox \"9d69d2be4a639691e5ca68ca2bfcdfb6179c3e3feaa548cc20aaae6955def004\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"a6cdb242d568a36953d948b99329b4031caade526fc09434701034a448621d98\""
Dec 13 13:49:42.642269 containerd[1480]: time="2024-12-13T13:49:42.637988094Z" level=info msg="StartContainer for \"a6cdb242d568a36953d948b99329b4031caade526fc09434701034a448621d98\""
Dec 13 13:49:42.684881 systemd[1]: Started cri-containerd-a6cdb242d568a36953d948b99329b4031caade526fc09434701034a448621d98.scope - libcontainer container a6cdb242d568a36953d948b99329b4031caade526fc09434701034a448621d98.
Dec 13 13:49:42.726616 systemd[1]: cri-containerd-906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469.scope: Deactivated successfully.
Dec 13 13:49:42.727400 systemd[1]: cri-containerd-906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469.scope: Consumed 12.562s CPU time.
Dec 13 13:49:42.735448 containerd[1480]: time="2024-12-13T13:49:42.735395046Z" level=info msg="StartContainer for \"a6cdb242d568a36953d948b99329b4031caade526fc09434701034a448621d98\" returns successfully"
Dec 13 13:49:42.756322 containerd[1480]: time="2024-12-13T13:49:42.756146037Z" level=info msg="shim disconnected" id=906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469 namespace=k8s.io
Dec 13 13:49:42.756322 containerd[1480]: time="2024-12-13T13:49:42.756314799Z" level=warning msg="cleaning up after shim disconnected" id=906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469 namespace=k8s.io
Dec 13 13:49:42.756322 containerd[1480]: time="2024-12-13T13:49:42.756326159Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Dec 13 13:49:42.840528 kubelet[2825]: E1213 13:49:42.840483 2825 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:41158->10.0.0.2:2379: read: connection timed out"
Dec 13 13:49:43.466356 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469-rootfs.mount: Deactivated successfully.
Dec 13 13:49:43.619876 kubelet[2825]: I1213 13:49:43.619843 2825 scope.go:117] "RemoveContainer" containerID="906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469"
Dec 13 13:49:43.635168 containerd[1480]: time="2024-12-13T13:49:43.635104501Z" level=info msg="CreateContainer within sandbox \"6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 13 13:49:43.656050 containerd[1480]: time="2024-12-13T13:49:43.656000215Z" level=info msg="CreateContainer within sandbox \"6d63c3b1335ce106d435a359cf37834429d2598f3b93dd16385f9436d936640a\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180\""
Dec 13 13:49:43.656614 containerd[1480]: time="2024-12-13T13:49:43.656585902Z" level=info msg="StartContainer for \"bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180\""
Dec 13 13:49:43.712714 systemd[1]: Started cri-containerd-bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180.scope - libcontainer container bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180.
Dec 13 13:49:43.777979 containerd[1480]: time="2024-12-13T13:49:43.777774770Z" level=info msg="StartContainer for \"bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180\" returns successfully"
Dec 13 13:49:44.464131 systemd[1]: run-containerd-runc-k8s.io-bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180-runc.J9MChz.mount: Deactivated successfully.
Dec 13 13:49:47.646310 systemd[1]: cri-containerd-bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180.scope: Deactivated successfully.
Dec 13 13:49:47.676218 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180-rootfs.mount: Deactivated successfully.
Dec 13 13:49:47.687348 containerd[1480]: time="2024-12-13T13:49:47.687255912Z" level=info msg="shim disconnected" id=bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180 namespace=k8s.io
Dec 13 13:49:47.687348 containerd[1480]: time="2024-12-13T13:49:47.687318552Z" level=warning msg="cleaning up after shim disconnected" id=bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180 namespace=k8s.io
Dec 13 13:49:47.687348 containerd[1480]: time="2024-12-13T13:49:47.687326552Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Dec 13 13:49:48.218677 kubelet[2825]: E1213 13:49:48.218059 2825 event.go:346] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40966->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4186-0-0-4-8ed7fad560.1810c0bdc0682196 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4186-0-0-4-8ed7fad560,UID:cc94f6c454172adfcff39394410207d5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4186-0-0-4-8ed7fad560,},FirstTimestamp:2024-12-13 13:49:37.74704271 +0000 UTC m=+1058.224349972,LastTimestamp:2024-12-13 13:49:37.74704271 +0000 UTC m=+1058.224349972,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4186-0-0-4-8ed7fad560,}"
Dec 13 13:49:48.585944 systemd[1]: cri-containerd-1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad.scope: Deactivated successfully.
Dec 13 13:49:48.587114 systemd[1]: cri-containerd-1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad.scope: Consumed 4.327s CPU time, 16.1M memory peak, 0B memory swap peak.
Dec 13 13:49:48.615143 containerd[1480]: time="2024-12-13T13:49:48.615066484Z" level=info msg="shim disconnected" id=1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad namespace=k8s.io
Dec 13 13:49:48.615143 containerd[1480]: time="2024-12-13T13:49:48.615137325Z" level=warning msg="cleaning up after shim disconnected" id=1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad namespace=k8s.io
Dec 13 13:49:48.615331 containerd[1480]: time="2024-12-13T13:49:48.615201086Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Dec 13 13:49:48.616583 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1241fba9acb5d40a1e3501040f8d9c2102cf733221fb95a9c624ba3275f45cad-rootfs.mount: Deactivated successfully.
Dec 13 13:49:48.641922 kubelet[2825]: I1213 13:49:48.641719 2825 scope.go:117] "RemoveContainer" containerID="906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469"
Dec 13 13:49:48.643413 kubelet[2825]: I1213 13:49:48.642737 2825 scope.go:117] "RemoveContainer" containerID="bf9ba30e7746083e5b5c2ac5eb2134a2e0f05630a3941e69e33185627424a180"
Dec 13 13:49:48.643413 kubelet[2825]: E1213 13:49:48.643035 2825 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-c7ccbd65-8kfhh_tigera-operator(7dbec970-de14-4ab3-8181-5e888a7371e4)\"" pod="tigera-operator/tigera-operator-c7ccbd65-8kfhh" podUID="7dbec970-de14-4ab3-8181-5e888a7371e4"
Dec 13 13:49:48.645413 containerd[1480]: time="2024-12-13T13:49:48.645059764Z" level=info msg="RemoveContainer for \"906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469\""
Dec 13 13:49:48.652762 containerd[1480]: time="2024-12-13T13:49:48.652715706Z" level=info msg="RemoveContainer for \"906893bbc2deccbaab088f4afa0393ea52188ea1e6229d70ce836a8756ae1469\" returns successfully"