Jan 29 11:03:02.906642 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jan 29 11:03:02.906665 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Wed Jan 29 09:37:00 -00 2025 Jan 29 11:03:02.906675 kernel: KASLR enabled Jan 29 11:03:02.906681 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jan 29 11:03:02.906687 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x138595418 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d98 Jan 29 11:03:02.906692 kernel: random: crng init done Jan 29 11:03:02.906699 kernel: secureboot: Secure boot disabled Jan 29 11:03:02.906705 kernel: ACPI: Early table checksum verification disabled Jan 29 11:03:02.906711 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Jan 29 11:03:02.906719 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Jan 29 11:03:02.906725 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906731 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906737 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906742 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906750 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906757 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906764 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906770 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906776 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 29 11:03:02.906782 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Jan 29 11:03:02.906788 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jan 29 11:03:02.906794 kernel: NUMA: Failed to initialise from firmware Jan 29 11:03:02.906800 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Jan 29 11:03:02.906806 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff] Jan 29 11:03:02.906828 kernel: Zone ranges: Jan 29 11:03:02.907883 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jan 29 11:03:02.907895 kernel: DMA32 empty Jan 29 11:03:02.907902 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Jan 29 11:03:02.907908 kernel: Movable zone start for each node Jan 29 11:03:02.907914 kernel: Early memory node ranges Jan 29 11:03:02.907920 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff] Jan 29 11:03:02.907927 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Jan 29 11:03:02.907933 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Jan 29 11:03:02.907939 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Jan 29 11:03:02.907945 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Jan 29 11:03:02.907951 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Jan 29 11:03:02.907958 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff] Jan 29 11:03:02.907968 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Jan 29 11:03:02.907974 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jan 29 11:03:02.907981 kernel: 
psci: probing for conduit method from ACPI. Jan 29 11:03:02.907990 kernel: psci: PSCIv1.1 detected in firmware. Jan 29 11:03:02.907996 kernel: psci: Using standard PSCI v0.2 function IDs Jan 29 11:03:02.908003 kernel: psci: Trusted OS migration not required Jan 29 11:03:02.908011 kernel: psci: SMC Calling Convention v1.1 Jan 29 11:03:02.908018 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jan 29 11:03:02.908025 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Jan 29 11:03:02.908031 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Jan 29 11:03:02.908038 kernel: pcpu-alloc: [0] 0 [0] 1 Jan 29 11:03:02.908044 kernel: Detected PIPT I-cache on CPU0 Jan 29 11:03:02.908051 kernel: CPU features: detected: GIC system register CPU interface Jan 29 11:03:02.908058 kernel: CPU features: detected: Hardware dirty bit management Jan 29 11:03:02.908064 kernel: CPU features: detected: Spectre-v4 Jan 29 11:03:02.908071 kernel: CPU features: detected: Spectre-BHB Jan 29 11:03:02.908078 kernel: CPU features: kernel page table isolation forced ON by KASLR Jan 29 11:03:02.908085 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jan 29 11:03:02.908092 kernel: CPU features: detected: ARM erratum 1418040 Jan 29 11:03:02.908098 kernel: CPU features: detected: SSBS not fully self-synchronizing Jan 29 11:03:02.908105 kernel: alternatives: applying boot alternatives Jan 29 11:03:02.908113 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=c8edc06d36325e34bb125a9ad39c4f788eb9f01102631b71efea3f9afa94c89e Jan 29 11:03:02.908120 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jan 29 11:03:02.908127 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 29 11:03:02.908133 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 29 11:03:02.908140 kernel: Fallback order for Node 0: 0 Jan 29 11:03:02.908146 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000 Jan 29 11:03:02.908155 kernel: Policy zone: Normal Jan 29 11:03:02.908161 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 29 11:03:02.908168 kernel: software IO TLB: area num 2. Jan 29 11:03:02.908174 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB) Jan 29 11:03:02.908181 kernel: Memory: 3882680K/4096000K available (10240K kernel code, 2186K rwdata, 8096K rodata, 39680K init, 897K bss, 213320K reserved, 0K cma-reserved) Jan 29 11:03:02.908188 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 29 11:03:02.908195 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 29 11:03:02.908202 kernel: rcu: RCU event tracing is enabled. Jan 29 11:03:02.908209 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 29 11:03:02.908215 kernel: Trampoline variant of Tasks RCU enabled. Jan 29 11:03:02.908222 kernel: Tracing variant of Tasks RCU enabled. Jan 29 11:03:02.908229 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 29 11:03:02.908237 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 29 11:03:02.908256 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jan 29 11:03:02.908263 kernel: GICv3: 256 SPIs implemented Jan 29 11:03:02.908269 kernel: GICv3: 0 Extended SPIs implemented Jan 29 11:03:02.908276 kernel: Root IRQ handler: gic_handle_irq Jan 29 11:03:02.908282 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jan 29 11:03:02.908288 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jan 29 11:03:02.908295 kernel: ITS [mem 0x08080000-0x0809ffff] Jan 29 11:03:02.908301 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1) Jan 29 11:03:02.908308 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1) Jan 29 11:03:02.908315 kernel: GICv3: using LPI property table @0x00000001000e0000 Jan 29 11:03:02.908326 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000 Jan 29 11:03:02.908333 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 29 11:03:02.908339 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 29 11:03:02.908346 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jan 29 11:03:02.908353 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jan 29 11:03:02.908360 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jan 29 11:03:02.908366 kernel: Console: colour dummy device 80x25 Jan 29 11:03:02.908374 kernel: ACPI: Core revision 20230628 Jan 29 11:03:02.908381 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jan 29 11:03:02.908388 kernel: pid_max: default: 32768 minimum: 301 Jan 29 11:03:02.908396 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 29 11:03:02.908403 kernel: landlock: Up and running. Jan 29 11:03:02.908410 kernel: SELinux: Initializing. Jan 29 11:03:02.908417 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 29 11:03:02.908424 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 29 11:03:02.908430 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 29 11:03:02.908437 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 29 11:03:02.908444 kernel: rcu: Hierarchical SRCU implementation. Jan 29 11:03:02.908451 kernel: rcu: Max phase no-delay instances is 400. Jan 29 11:03:02.908458 kernel: Platform MSI: ITS@0x8080000 domain created Jan 29 11:03:02.908466 kernel: PCI/MSI: ITS@0x8080000 domain created Jan 29 11:03:02.908473 kernel: Remapping and enabling EFI services. Jan 29 11:03:02.908479 kernel: smp: Bringing up secondary CPUs ... Jan 29 11:03:02.908486 kernel: Detected PIPT I-cache on CPU1 Jan 29 11:03:02.908493 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jan 29 11:03:02.908500 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000 Jan 29 11:03:02.908507 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 29 11:03:02.908513 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jan 29 11:03:02.908520 kernel: smp: Brought up 1 node, 2 CPUs Jan 29 11:03:02.908529 kernel: SMP: Total of 2 processors activated. 
Jan 29 11:03:02.908536 kernel: CPU features: detected: 32-bit EL0 Support Jan 29 11:03:02.908548 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jan 29 11:03:02.908557 kernel: CPU features: detected: Common not Private translations Jan 29 11:03:02.908564 kernel: CPU features: detected: CRC32 instructions Jan 29 11:03:02.908571 kernel: CPU features: detected: Enhanced Virtualization Traps Jan 29 11:03:02.908578 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jan 29 11:03:02.908585 kernel: CPU features: detected: LSE atomic instructions Jan 29 11:03:02.908592 kernel: CPU features: detected: Privileged Access Never Jan 29 11:03:02.908601 kernel: CPU features: detected: RAS Extension Support Jan 29 11:03:02.908608 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jan 29 11:03:02.908616 kernel: CPU: All CPU(s) started at EL1 Jan 29 11:03:02.908623 kernel: alternatives: applying system-wide alternatives Jan 29 11:03:02.908630 kernel: devtmpfs: initialized Jan 29 11:03:02.908637 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 29 11:03:02.908644 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 29 11:03:02.908651 kernel: pinctrl core: initialized pinctrl subsystem Jan 29 11:03:02.908660 kernel: SMBIOS 3.0.0 present. Jan 29 11:03:02.908667 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Jan 29 11:03:02.908674 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 29 11:03:02.908681 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Jan 29 11:03:02.908689 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 29 11:03:02.908696 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 29 11:03:02.908703 kernel: audit: initializing netlink subsys (disabled) Jan 29 11:03:02.908710 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1 Jan 29 11:03:02.908719 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 29 11:03:02.908726 kernel: cpuidle: using governor menu Jan 29 11:03:02.908733 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Jan 29 11:03:02.908740 kernel: ASID allocator initialised with 32768 entries Jan 29 11:03:02.908747 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 29 11:03:02.908754 kernel: Serial: AMBA PL011 UART driver Jan 29 11:03:02.908761 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jan 29 11:03:02.908769 kernel: Modules: 0 pages in range for non-PLT usage Jan 29 11:03:02.908776 kernel: Modules: 508960 pages in range for PLT usage Jan 29 11:03:02.908783 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 29 11:03:02.908792 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 29 11:03:02.908799 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 29 11:03:02.908806 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 29 11:03:02.909866 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 29 11:03:02.909883 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 29 11:03:02.909890 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 29 11:03:02.909898 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 29 11:03:02.909905 kernel: ACPI: Added _OSI(Module Device) Jan 29 11:03:02.909912 kernel: ACPI: Added _OSI(Processor Device) Jan 29 11:03:02.909925 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 29 11:03:02.909932 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 29 11:03:02.909940 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 29 11:03:02.909947 kernel: ACPI: Interpreter enabled Jan 29 11:03:02.909954 kernel: ACPI: Using GIC for interrupt routing Jan 29 11:03:02.909961 kernel: ACPI: MCFG table detected, 1 entries Jan 29 11:03:02.909968 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 29 11:03:02.909976 kernel: printk: console [ttyAMA0] enabled Jan 29 11:03:02.909983 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 29 11:03:02.910151 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 29 11:03:02.910224 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 29 11:03:02.910315 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 29 11:03:02.910383 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 29 11:03:02.910447 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 29 11:03:02.910456 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 29 11:03:02.910464 kernel: PCI host bridge to bus 0000:00 Jan 29 11:03:02.910544 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 29 11:03:02.910605 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 29 11:03:02.910664 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 29 11:03:02.910722 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 29 11:03:02.910809 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 Jan 29 11:03:02.911603 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 Jan 29 11:03:02.911702 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff] Jan 29 11:03:02.911786 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref] Jan 29 11:03:02.912527 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.912615 kernel: pci 
0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff] Jan 29 11:03:02.912694 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.912763 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff] Jan 29 11:03:02.913934 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914024 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff] Jan 29 11:03:02.914104 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914169 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff] Jan 29 11:03:02.914275 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914357 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff] Jan 29 11:03:02.914435 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914502 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff] Jan 29 11:03:02.914573 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914639 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff] Jan 29 11:03:02.914709 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914775 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff] Jan 29 11:03:02.914885 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 Jan 29 11:03:02.914964 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff] Jan 29 11:03:02.915036 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 Jan 29 11:03:02.915102 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007] Jan 29 11:03:02.915179 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 Jan 29 11:03:02.915271 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff] Jan 29 11:03:02.915346 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref] Jan 29 11:03:02.915419 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref] Jan 29 11:03:02.915494 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 Jan 29 11:03:02.915564 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit] Jan 29 11:03:02.915644 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 Jan 29 11:03:02.915713 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff] Jan 29 11:03:02.915781 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref] Jan 29 11:03:02.916764 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 Jan 29 11:03:02.916933 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref] Jan 29 11:03:02.917020 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 Jan 29 11:03:02.917089 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff] Jan 29 11:03:02.917157 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref] Jan 29 11:03:02.917231 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 Jan 29 11:03:02.917349 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff] Jan 29 11:03:02.917427 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref] Jan 29 11:03:02.917503 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 Jan 29 11:03:02.917572 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff] Jan 29 11:03:02.917640 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref] Jan 29 11:03:02.917707 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref] Jan 29 11:03:02.917776 kernel: pci 0000:00:02.0: bridge 
window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 29 11:03:02.917877 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 29 11:03:02.917946 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 29 11:03:02.918016 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 29 11:03:02.918089 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 29 11:03:02.918155 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 29 11:03:02.918222 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 29 11:03:02.918303 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 29 11:03:02.918371 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 29 11:03:02.918444 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 29 11:03:02.918510 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 29 11:03:02.918576 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 29 11:03:02.918763 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 29 11:03:02.918932 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 29 11:03:02.919006 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 29 11:03:02.919075 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 29 11:03:02.919146 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 29 11:03:02.919210 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 29 11:03:02.919293 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 29 11:03:02.919359 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Jan 29 11:03:02.919423 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Jan 29 11:03:02.919491 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 29 11:03:02.919555 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 29 11:03:02.919619 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 29 11:03:02.919689 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 29 11:03:02.919753 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 29 11:03:02.919869 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Jan 29 11:03:02.921141 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 
0x10000000-0x101fffff] Jan 29 11:03:02.921211 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref] Jan 29 11:03:02.921292 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff] Jan 29 11:03:02.921359 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref] Jan 29 11:03:02.921433 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff] Jan 29 11:03:02.921498 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref] Jan 29 11:03:02.921566 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff] Jan 29 11:03:02.921630 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref] Jan 29 11:03:02.921699 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff] Jan 29 11:03:02.921764 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref] Jan 29 11:03:02.923332 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff] Jan 29 11:03:02.923421 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 29 11:03:02.923499 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff] Jan 29 11:03:02.923568 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 29 11:03:02.923636 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff] Jan 29 11:03:02.923702 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 29 11:03:02.923771 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff] Jan 29 11:03:02.923884 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref] Jan 29 11:03:02.923960 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref] Jan 29 11:03:02.924024 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff] Jan 29 11:03:02.924094 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff] Jan 29 11:03:02.924170 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff] Jan 29 11:03:02.924268 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff] Jan 29 11:03:02.924338 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff] Jan 29 11:03:02.924404 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff] Jan 29 11:03:02.924474 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff] Jan 29 11:03:02.924542 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff] Jan 29 11:03:02.924607 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff] Jan 29 11:03:02.924673 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff] Jan 29 11:03:02.924741 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff] Jan 29 11:03:02.924807 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff] Jan 29 11:03:02.924915 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff] Jan 29 11:03:02.924982 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff] Jan 29 11:03:02.925051 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff] Jan 29 11:03:02.925116 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff] Jan 29 11:03:02.925179 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff] Jan 29 11:03:02.925274 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff] Jan 29 11:03:02.925350 kernel: pci 0000:00:03.0: BAR 13: assigned [io 
0x9000-0x9fff] Jan 29 11:03:02.925420 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007] Jan 29 11:03:02.925491 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref] Jan 29 11:03:02.925560 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref] Jan 29 11:03:02.925631 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff] Jan 29 11:03:02.925698 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 29 11:03:02.925762 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Jan 29 11:03:02.925910 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Jan 29 11:03:02.925982 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 29 11:03:02.926054 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit] Jan 29 11:03:02.926124 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 29 11:03:02.926188 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Jan 29 11:03:02.926265 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Jan 29 11:03:02.926332 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 29 11:03:02.926458 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref] Jan 29 11:03:02.926527 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff] Jan 29 11:03:02.926597 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 29 11:03:02.926662 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Jan 29 11:03:02.926725 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Jan 29 11:03:02.926788 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 29 11:03:02.926921 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref] Jan 29 11:03:02.926990 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 29 11:03:02.927054 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Jan 29 11:03:02.927117 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Jan 29 11:03:02.927184 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 29 11:03:02.927296 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref] Jan 29 11:03:02.927373 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff] Jan 29 11:03:02.927439 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 29 11:03:02.927504 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Jan 29 11:03:02.927567 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jan 29 11:03:02.927631 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 29 11:03:02.927705 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref] Jan 29 11:03:02.927776 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff] Jan 29 11:03:02.927890 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 29 11:03:02.927963 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Jan 29 11:03:02.928028 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 29 11:03:02.928092 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 29 11:03:02.928164 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref] Jan 29 11:03:02.928231 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref] Jan 29 11:03:02.928310 kernel: pci 
0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff] Jan 29 11:03:02.928383 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 29 11:03:02.928452 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Jan 29 11:03:02.928516 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 29 11:03:02.928666 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 29 11:03:02.928743 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 29 11:03:02.928810 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Jan 29 11:03:02.928906 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 29 11:03:02.928973 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 29 11:03:02.929045 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 29 11:03:02.929112 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Jan 29 11:03:02.929177 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Jan 29 11:03:02.929255 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 29 11:03:02.929331 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 29 11:03:02.929392 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 29 11:03:02.929452 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 29 11:03:02.929526 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Jan 29 11:03:02.929588 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 29 11:03:02.929648 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 29 11:03:02.929717 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Jan 29 11:03:02.929778 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 29 11:03:02.929883 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 29 11:03:02.929954 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Jan 29 11:03:02.930019 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 29 11:03:02.930091 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 29 11:03:02.930169 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Jan 29 11:03:02.930230 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 29 11:03:02.930330 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 29 11:03:02.930406 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Jan 29 11:03:02.930471 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 29 11:03:02.930533 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 29 11:03:02.930606 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Jan 29 11:03:02.930670 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 29 11:03:02.930733 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 29 11:03:02.930802 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Jan 29 11:03:02.930893 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 29 11:03:02.930956 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 29 11:03:02.931024 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jan 29 11:03:02.931086 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 29 11:03:02.931146 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 29 11:03:02.931219 kernel: 
pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jan 29 11:03:02.931295 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 29 11:03:02.931366 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 29 11:03:02.931376 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 29 11:03:02.931384 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 29 11:03:02.931391 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 29 11:03:02.931399 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 29 11:03:02.931407 kernel: iommu: Default domain type: Translated Jan 29 11:03:02.931417 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 29 11:03:02.931424 kernel: efivars: Registered efivars operations Jan 29 11:03:02.931432 kernel: vgaarb: loaded Jan 29 11:03:02.931439 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 29 11:03:02.931447 kernel: VFS: Disk quotas dquot_6.6.0 Jan 29 11:03:02.931454 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 29 11:03:02.931462 kernel: pnp: PnP ACPI init Jan 29 11:03:02.931542 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 29 11:03:02.931555 kernel: pnp: PnP ACPI: found 1 devices Jan 29 11:03:02.931563 kernel: NET: Registered PF_INET protocol family Jan 29 11:03:02.931570 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 29 11:03:02.931578 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 29 11:03:02.931586 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 29 11:03:02.931593 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 29 11:03:02.931601 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 29 11:03:02.931610 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 29 11:03:02.931617 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 29 11:03:02.931627 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 29 11:03:02.931634 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 29 11:03:02.931709 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 29 11:03:02.931720 kernel: PCI: CLS 0 bytes, default 64 Jan 29 11:03:02.931728 kernel: kvm [1]: HYP mode not available Jan 29 11:03:02.931735 kernel: Initialise system trusted keyrings Jan 29 11:03:02.931743 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 29 11:03:02.931751 kernel: Key type asymmetric registered Jan 29 11:03:02.931758 kernel: Asymmetric key parser 'x509' registered Jan 29 11:03:02.931767 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 29 11:03:02.931775 kernel: io scheduler mq-deadline registered Jan 29 11:03:02.931782 kernel: io scheduler kyber registered Jan 29 11:03:02.931790 kernel: io scheduler bfq registered Jan 29 11:03:02.931799 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 29 11:03:02.931956 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jan 29 11:03:02.932035 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jan 29 11:03:02.932100 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.932171 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Jan 29 11:03:02.932236 kernel: pcieport 
0000:00:02.1: AER: enabled with IRQ 51 Jan 29 11:03:02.932341 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.932409 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jan 29 11:03:02.932475 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jan 29 11:03:02.932539 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.932610 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jan 29 11:03:02.932675 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jan 29 11:03:02.932740 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.932806 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jan 29 11:03:02.932998 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jan 29 11:03:02.933065 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.933136 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Jan 29 11:03:02.933201 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jan 29 11:03:02.933280 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.933348 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jan 29 11:03:02.933412 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jan 29 11:03:02.933476 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.933544 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jan 29 11:03:02.933608 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jan 29 11:03:02.933671 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.933681 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 29 11:03:02.933745 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jan 29 11:03:02.933811 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jan 29 11:03:02.933890 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 29 11:03:02.933901 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 29 11:03:02.933909 kernel: ACPI: button: Power Button [PWRB] Jan 29 11:03:02.933917 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 29 11:03:02.933987 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 29 11:03:02.934057 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jan 29 11:03:02.934068 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 29 11:03:02.934076 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 29 11:03:02.934142 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jan 29 11:03:02.934154 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jan 29 11:03:02.934162 kernel: thunder_xcv, ver 1.0 Jan 29 11:03:02.934169 kernel: thunder_bgx, ver 1.0 Jan 29 11:03:02.934177 kernel: nicpf, ver 1.0 Jan 29 11:03:02.934184 kernel: nicvf, ver 
1.0 Jan 29 11:03:02.934272 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 29 11:03:02.934339 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-01-29T11:03:02 UTC (1738148582) Jan 29 11:03:02.934349 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 29 11:03:02.934359 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Jan 29 11:03:02.934367 kernel: watchdog: Delayed init of the lockup detector failed: -19 Jan 29 11:03:02.934375 kernel: watchdog: Hard watchdog permanently disabled Jan 29 11:03:02.934382 kernel: NET: Registered PF_INET6 protocol family Jan 29 11:03:02.934390 kernel: Segment Routing with IPv6 Jan 29 11:03:02.934397 kernel: In-situ OAM (IOAM) with IPv6 Jan 29 11:03:02.934405 kernel: NET: Registered PF_PACKET protocol family Jan 29 11:03:02.934414 kernel: Key type dns_resolver registered Jan 29 11:03:02.934422 kernel: registered taskstats version 1 Jan 29 11:03:02.934430 kernel: Loading compiled-in X.509 certificates Jan 29 11:03:02.934438 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: f3333311a24aa8c58222f4e98a07eaa1f186ad1a' Jan 29 11:03:02.934445 kernel: Key type .fscrypt registered Jan 29 11:03:02.934453 kernel: Key type fscrypt-provisioning registered Jan 29 11:03:02.934460 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 29 11:03:02.934468 kernel: ima: Allocated hash algorithm: sha1 Jan 29 11:03:02.934475 kernel: ima: No architecture policies found Jan 29 11:03:02.934483 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 29 11:03:02.934492 kernel: clk: Disabling unused clocks Jan 29 11:03:02.934499 kernel: Freeing unused kernel memory: 39680K Jan 29 11:03:02.934507 kernel: Run /init as init process Jan 29 11:03:02.934514 kernel: with arguments: Jan 29 11:03:02.934521 kernel: /init Jan 29 11:03:02.934529 kernel: with environment: Jan 29 11:03:02.934536 kernel: HOME=/ Jan 29 11:03:02.934544 kernel: TERM=linux Jan 29 11:03:02.934551 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 29 11:03:02.934560 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 29 11:03:02.934572 systemd[1]: Detected virtualization kvm. Jan 29 11:03:02.934580 systemd[1]: Detected architecture arm64. Jan 29 11:03:02.934588 systemd[1]: Running in initrd. Jan 29 11:03:02.934596 systemd[1]: No hostname configured, using default hostname. Jan 29 11:03:02.934603 systemd[1]: Hostname set to . Jan 29 11:03:02.934611 systemd[1]: Initializing machine ID from VM UUID. Jan 29 11:03:02.934621 systemd[1]: Queued start job for default target initrd.target. Jan 29 11:03:02.934629 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 29 11:03:02.934637 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 29 11:03:02.934645 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 29 11:03:02.934654 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 29 11:03:02.934662 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... 
Jan 29 11:03:02.936844 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 29 11:03:02.936872 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 29 11:03:02.936888 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 29 11:03:02.936896 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 29 11:03:02.936904 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 29 11:03:02.936912 systemd[1]: Reached target paths.target - Path Units. Jan 29 11:03:02.936920 systemd[1]: Reached target slices.target - Slice Units. Jan 29 11:03:02.936928 systemd[1]: Reached target swap.target - Swaps. Jan 29 11:03:02.936936 systemd[1]: Reached target timers.target - Timer Units. Jan 29 11:03:02.936944 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 29 11:03:02.936954 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 29 11:03:02.936963 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 29 11:03:02.936971 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 29 11:03:02.936979 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 29 11:03:02.936987 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 29 11:03:02.936995 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 29 11:03:02.937003 systemd[1]: Reached target sockets.target - Socket Units. Jan 29 11:03:02.937011 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 29 11:03:02.937021 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 29 11:03:02.937029 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 29 11:03:02.937037 systemd[1]: Starting systemd-fsck-usr.service... Jan 29 11:03:02.937045 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 29 11:03:02.937054 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 29 11:03:02.937062 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 29 11:03:02.937070 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 29 11:03:02.937077 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 29 11:03:02.937118 systemd-journald[236]: Collecting audit messages is disabled. Jan 29 11:03:02.937141 systemd[1]: Finished systemd-fsck-usr.service. Jan 29 11:03:02.937152 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 29 11:03:02.937161 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 29 11:03:02.937169 kernel: Bridge firewalling registered Jan 29 11:03:02.937177 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 29 11:03:02.937185 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 11:03:02.937193 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 29 11:03:02.937202 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Jan 29 11:03:02.937213 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 29 11:03:02.937221 systemd-journald[236]: Journal started Jan 29 11:03:02.937240 systemd-journald[236]: Runtime Journal (/run/log/journal/96563e3e90a54625af3538fa1d80f21c) is 8.0M, max 76.6M, 68.6M free. Jan 29 11:03:02.897875 systemd-modules-load[238]: Inserted module 'overlay' Jan 29 11:03:02.917994 systemd-modules-load[238]: Inserted module 'br_netfilter' Jan 29 11:03:02.940848 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 29 11:03:02.940888 systemd[1]: Started systemd-journald.service - Journal Service. Jan 29 11:03:02.947810 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 29 11:03:02.953599 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 29 11:03:02.960354 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 29 11:03:02.961679 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 29 11:03:02.964127 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 29 11:03:02.972891 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 29 11:03:02.978346 dracut-cmdline[267]: dracut-dracut-053 Jan 29 11:03:02.981079 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 29 11:03:02.989323 dracut-cmdline[267]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=c8edc06d36325e34bb125a9ad39c4f788eb9f01102631b71efea3f9afa94c89e Jan 29 11:03:03.006592 systemd-resolved[281]: Positive Trust Anchors: Jan 29 11:03:03.006666 systemd-resolved[281]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 29 11:03:03.006697 systemd-resolved[281]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 29 11:03:03.016094 systemd-resolved[281]: Defaulting to hostname 'linux'. Jan 29 11:03:03.018031 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 29 11:03:03.019192 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 29 11:03:03.064878 kernel: SCSI subsystem initialized Jan 29 11:03:03.069878 kernel: Loading iSCSI transport class v2.0-870. Jan 29 11:03:03.076854 kernel: iscsi: registered transport (tcp) Jan 29 11:03:03.090870 kernel: iscsi: registered transport (qla4xxx) Jan 29 11:03:03.090950 kernel: QLogic iSCSI HBA Driver Jan 29 11:03:03.135309 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 29 11:03:03.140980 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Jan 29 11:03:03.158904 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 29 11:03:03.159013 kernel: device-mapper: uevent: version 1.0.3 Jan 29 11:03:03.159042 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 29 11:03:03.207884 kernel: raid6: neonx8 gen() 15670 MB/s Jan 29 11:03:03.224857 kernel: raid6: neonx4 gen() 15555 MB/s Jan 29 11:03:03.241872 kernel: raid6: neonx2 gen() 13144 MB/s Jan 29 11:03:03.258873 kernel: raid6: neonx1 gen() 10438 MB/s Jan 29 11:03:03.275869 kernel: raid6: int64x8 gen() 6928 MB/s Jan 29 11:03:03.292863 kernel: raid6: int64x4 gen() 7311 MB/s Jan 29 11:03:03.309861 kernel: raid6: int64x2 gen() 6099 MB/s Jan 29 11:03:03.326882 kernel: raid6: int64x1 gen() 5027 MB/s Jan 29 11:03:03.326960 kernel: raid6: using algorithm neonx8 gen() 15670 MB/s Jan 29 11:03:03.343869 kernel: raid6: .... xor() 11849 MB/s, rmw enabled Jan 29 11:03:03.343959 kernel: raid6: using neon recovery algorithm Jan 29 11:03:03.348858 kernel: xor: measuring software checksum speed Jan 29 11:03:03.348917 kernel: 8regs : 19845 MB/sec Jan 29 11:03:03.348940 kernel: 32regs : 19650 MB/sec Jan 29 11:03:03.348961 kernel: arm64_neon : 23941 MB/sec Jan 29 11:03:03.349848 kernel: xor: using function: arm64_neon (23941 MB/sec) Jan 29 11:03:03.398886 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 29 11:03:03.414612 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 29 11:03:03.421027 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 29 11:03:03.434602 systemd-udevd[456]: Using default interface naming scheme 'v255'. Jan 29 11:03:03.437925 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 29 11:03:03.448151 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 29 11:03:03.462879 dracut-pre-trigger[463]: rd.md=0: removing MD RAID activation Jan 29 11:03:03.494302 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 29 11:03:03.506129 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 29 11:03:03.557880 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 29 11:03:03.567149 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 29 11:03:03.590378 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 29 11:03:03.593072 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 29 11:03:03.593757 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 29 11:03:03.595214 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 29 11:03:03.605125 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 29 11:03:03.620943 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 29 11:03:03.669001 kernel: scsi host0: Virtio SCSI HBA Jan 29 11:03:03.670367 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 29 11:03:03.670506 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 29 11:03:03.675497 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 29 11:03:03.675542 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 29 11:03:03.671521 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 29 11:03:03.672385 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 29 11:03:03.673507 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 11:03:03.674758 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 29 11:03:03.690212 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 29 11:03:03.695529 kernel: ACPI: bus type USB registered Jan 29 11:03:03.695551 kernel: usbcore: registered new interface driver usbfs Jan 29 11:03:03.699922 kernel: usbcore: registered new interface driver hub Jan 29 11:03:03.701841 kernel: usbcore: registered new device driver usb Jan 29 11:03:03.711895 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 11:03:03.716314 kernel: sr 0:0:0:0: Power-on or device reset occurred Jan 29 11:03:03.720536 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jan 29 11:03:03.720649 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 29 11:03:03.720660 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 29 11:03:03.718305 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 29 11:03:03.734859 kernel: sd 0:0:0:1: Power-on or device reset occurred Jan 29 11:03:03.746550 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jan 29 11:03:03.746686 kernel: sd 0:0:0:1: [sda] Write Protect is off Jan 29 11:03:03.746781 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jan 29 11:03:03.746905 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 29 11:03:03.746989 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 29 11:03:03.756435 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 29 11:03:03.756548 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 29 11:03:03.756629 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 29 11:03:03.756647 kernel: GPT:17805311 != 80003071 Jan 29 11:03:03.756657 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 29 11:03:03.756666 kernel: GPT:17805311 != 80003071 Jan 29 11:03:03.756675 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 29 11:03:03.756684 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 29 11:03:03.756693 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jan 29 11:03:03.756796 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 29 11:03:03.756913 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 29 11:03:03.756994 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 29 11:03:03.757076 kernel: hub 1-0:1.0: USB hub found Jan 29 11:03:03.757173 kernel: hub 1-0:1.0: 4 ports detected Jan 29 11:03:03.757273 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 29 11:03:03.757387 kernel: hub 2-0:1.0: USB hub found Jan 29 11:03:03.757479 kernel: hub 2-0:1.0: 4 ports detected Jan 29 11:03:03.745511 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 29 11:03:03.792844 kernel: BTRFS: device fsid b5bc7ecc-f31a-46c7-9582-5efca7819025 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (514) Jan 29 11:03:03.795848 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (508) Jan 29 11:03:03.802894 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 29 11:03:03.808466 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 29 11:03:03.815461 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 29 11:03:03.823475 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Jan 29 11:03:03.824396 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 29 11:03:03.832049 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 29 11:03:03.838942 disk-uuid[573]: Primary Header is updated. Jan 29 11:03:03.838942 disk-uuid[573]: Secondary Entries is updated. Jan 29 11:03:03.838942 disk-uuid[573]: Secondary Header is updated. Jan 29 11:03:03.843845 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 29 11:03:03.994904 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 29 11:03:04.236943 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 29 11:03:04.374096 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 29 11:03:04.374152 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 29 11:03:04.375586 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 29 11:03:04.430022 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 29 11:03:04.430672 kernel: usbcore: registered new interface driver usbhid Jan 29 11:03:04.430695 kernel: usbhid: USB HID core driver Jan 29 11:03:04.856142 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 29 11:03:04.856196 disk-uuid[574]: The operation has completed successfully. Jan 29 11:03:04.909108 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 29 11:03:04.910009 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 29 11:03:04.923138 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 29 11:03:04.929174 sh[588]: Success Jan 29 11:03:04.940857 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Jan 29 11:03:05.000259 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 29 11:03:05.001872 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 29 11:03:05.004061 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
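The GPT warnings above are what you see when a disk image built for a smaller disk is attached to a larger one: the alternate (backup) GPT header should sit on the last LBA of the device, but it is still where the original image ended, and disk-uuid.service then rewrites the secondary header and entries. A small sketch of the check, using the block counts reported above (the original image size is inferred from the alternate-header LBA, an assumption):

    SECTOR = 512
    disk_sectors = 80003072              # sd 0:0:0:1: [sda] 80003072 512-byte logical blocks
    expected_alt_lba = disk_sectors - 1  # backup GPT header belongs on the last LBA
    found_alt_lba = 17805311             # where the kernel actually found it

    print(f"expected alternate header at LBA {expected_alt_lba}, found at {found_alt_lba}")
    if found_alt_lba != expected_alt_lba:
        # Same situation the kernel reports as "GPT:17805311 != 80003071";
        # disk-uuid.service later updates the secondary header to match the disk.
        image_size_gib = (found_alt_lba + 1) * SECTOR / 2**30
        disk_size_gib = disk_sectors * SECTOR / 2**30
        print(f"image was ~{image_size_gib:.1f} GiB, disk is ~{disk_size_gib:.1f} GiB")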
Jan 29 11:03:05.025272 kernel: BTRFS info (device dm-0): first mount of filesystem b5bc7ecc-f31a-46c7-9582-5efca7819025 Jan 29 11:03:05.025350 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 29 11:03:05.025373 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 29 11:03:05.025844 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 29 11:03:05.026901 kernel: BTRFS info (device dm-0): using free space tree Jan 29 11:03:05.032845 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 29 11:03:05.035008 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 29 11:03:05.037009 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 29 11:03:05.043065 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 29 11:03:05.048013 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 29 11:03:05.060589 kernel: BTRFS info (device sda6): first mount of filesystem 9c6de53f-d522-4994-b092-a63f342c3ab0 Jan 29 11:03:05.060638 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 29 11:03:05.060650 kernel: BTRFS info (device sda6): using free space tree Jan 29 11:03:05.063977 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 29 11:03:05.064035 kernel: BTRFS info (device sda6): auto enabling async discard Jan 29 11:03:05.076871 kernel: BTRFS info (device sda6): last unmount of filesystem 9c6de53f-d522-4994-b092-a63f342c3ab0 Jan 29 11:03:05.077295 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 29 11:03:05.082611 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 29 11:03:05.090067 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 29 11:03:05.180752 ignition[683]: Ignition 2.20.0 Jan 29 11:03:05.180766 ignition[683]: Stage: fetch-offline Jan 29 11:03:05.180801 ignition[683]: no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:05.180809 ignition[683]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:05.180982 ignition[683]: parsed url from cmdline: "" Jan 29 11:03:05.185104 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 29 11:03:05.180986 ignition[683]: no config URL provided Jan 29 11:03:05.180991 ignition[683]: reading system config file "/usr/lib/ignition/user.ign" Jan 29 11:03:05.180998 ignition[683]: no config at "/usr/lib/ignition/user.ign" Jan 29 11:03:05.181003 ignition[683]: failed to fetch config: resource requires networking Jan 29 11:03:05.181164 ignition[683]: Ignition finished successfully Jan 29 11:03:05.194059 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 29 11:03:05.202115 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 29 11:03:05.223393 systemd-networkd[777]: lo: Link UP Jan 29 11:03:05.223995 systemd-networkd[777]: lo: Gained carrier Jan 29 11:03:05.226030 systemd-networkd[777]: Enumeration completed Jan 29 11:03:05.226618 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 29 11:03:05.228033 systemd[1]: Reached target network.target - Network. Jan 29 11:03:05.228919 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Jan 29 11:03:05.228922 systemd-networkd[777]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 29 11:03:05.229631 systemd-networkd[777]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:05.229634 systemd-networkd[777]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 29 11:03:05.230145 systemd-networkd[777]: eth0: Link UP Jan 29 11:03:05.230150 systemd-networkd[777]: eth0: Gained carrier Jan 29 11:03:05.230156 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:05.235057 systemd-networkd[777]: eth1: Link UP Jan 29 11:03:05.235061 systemd-networkd[777]: eth1: Gained carrier Jan 29 11:03:05.235068 systemd-networkd[777]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:05.236029 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 29 11:03:05.248508 ignition[779]: Ignition 2.20.0 Jan 29 11:03:05.248517 ignition[779]: Stage: fetch Jan 29 11:03:05.248704 ignition[779]: no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:05.248714 ignition[779]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:05.248803 ignition[779]: parsed url from cmdline: "" Jan 29 11:03:05.248807 ignition[779]: no config URL provided Jan 29 11:03:05.248829 ignition[779]: reading system config file "/usr/lib/ignition/user.ign" Jan 29 11:03:05.248837 ignition[779]: no config at "/usr/lib/ignition/user.ign" Jan 29 11:03:05.248924 ignition[779]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 29 11:03:05.249845 ignition[779]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Jan 29 11:03:05.260920 systemd-networkd[777]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jan 29 11:03:05.294932 systemd-networkd[777]: eth0: DHCPv4 address 78.46.186.225/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 29 11:03:05.450694 ignition[779]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Jan 29 11:03:05.454267 ignition[779]: GET result: OK Jan 29 11:03:05.454430 ignition[779]: parsing config with SHA512: 84d241b50db1bd99ff23825f22bb2da9d71ddb14165302089340cb74ddb108747132e36aa1b0e6f226113923db3cf543cc45d73773bd5a6e2fa107b476953ab5 Jan 29 11:03:05.460759 unknown[779]: fetched base config from "system" Jan 29 11:03:05.460788 unknown[779]: fetched base config from "system" Jan 29 11:03:05.461430 ignition[779]: fetch: fetch complete Jan 29 11:03:05.460795 unknown[779]: fetched user config from "hetzner" Jan 29 11:03:05.461437 ignition[779]: fetch: fetch passed Jan 29 11:03:05.464509 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 29 11:03:05.461497 ignition[779]: Ignition finished successfully Jan 29 11:03:05.474039 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 29 11:03:05.487046 ignition[787]: Ignition 2.20.0 Jan 29 11:03:05.487056 ignition[787]: Stage: kargs Jan 29 11:03:05.487260 ignition[787]: no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:05.487270 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:05.491790 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
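The fetch stage above fails on attempt #1 because the request races network bring-up ("network is unreachable"), then succeeds on attempt #2 once the interfaces have their DHCP leases. A rough Python approximation of that fetch-with-retry against the Hetzner userdata endpoint shown above (the retry interval and error handling are assumptions, not Ignition's exact behaviour):

    import hashlib
    import time
    import urllib.request

    USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"

    def fetch_userdata(retries=10, delay=2.0):
        for attempt in range(1, retries + 1):
            try:
                with urllib.request.urlopen(USERDATA_URL, timeout=10) as resp:
                    data = resp.read()
                # Ignition logs the SHA512 of the config it is about to parse.
                print("parsing config with SHA512:", hashlib.sha512(data).hexdigest())
                return data
            except OSError as err:
                # e.g. "dial tcp ... network is unreachable" on the first attempt
                print(f"GET error on attempt #{attempt}: {err}")
                time.sleep(delay)
        raise RuntimeError("failed to fetch config")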
Jan 29 11:03:05.488193 ignition[787]: kargs: kargs passed Jan 29 11:03:05.488255 ignition[787]: Ignition finished successfully Jan 29 11:03:05.501700 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 29 11:03:05.513186 ignition[793]: Ignition 2.20.0 Jan 29 11:03:05.513196 ignition[793]: Stage: disks Jan 29 11:03:05.513384 ignition[793]: no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:05.515962 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 29 11:03:05.513395 ignition[793]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:05.517463 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 29 11:03:05.514373 ignition[793]: disks: disks passed Jan 29 11:03:05.518673 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 29 11:03:05.514420 ignition[793]: Ignition finished successfully Jan 29 11:03:05.519905 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 29 11:03:05.520846 systemd[1]: Reached target sysinit.target - System Initialization. Jan 29 11:03:05.521633 systemd[1]: Reached target basic.target - Basic System. Jan 29 11:03:05.528037 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 29 11:03:05.543070 systemd-fsck[802]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 29 11:03:05.547185 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 29 11:03:05.553238 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 29 11:03:05.602865 kernel: EXT4-fs (sda9): mounted filesystem bd47c032-97f4-4b3a-b174-3601de374086 r/w with ordered data mode. Quota mode: none. Jan 29 11:03:05.604003 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 29 11:03:05.605192 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 29 11:03:05.619013 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 29 11:03:05.622743 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 29 11:03:05.630870 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (810) Jan 29 11:03:05.632015 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 29 11:03:05.634954 kernel: BTRFS info (device sda6): first mount of filesystem 9c6de53f-d522-4994-b092-a63f342c3ab0 Jan 29 11:03:05.634978 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 29 11:03:05.634988 kernel: BTRFS info (device sda6): using free space tree Jan 29 11:03:05.636318 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 29 11:03:05.640634 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 29 11:03:05.640662 kernel: BTRFS info (device sda6): auto enabling async discard Jan 29 11:03:05.636357 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 29 11:03:05.641170 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 29 11:03:05.644593 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 29 11:03:05.652088 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 29 11:03:05.687521 coreos-metadata[812]: Jan 29 11:03:05.687 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 29 11:03:05.689320 coreos-metadata[812]: Jan 29 11:03:05.689 INFO Fetch successful Jan 29 11:03:05.691862 coreos-metadata[812]: Jan 29 11:03:05.690 INFO wrote hostname ci-4152-2-0-b-6e231d00a9 to /sysroot/etc/hostname Jan 29 11:03:05.694630 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 29 11:03:05.704655 initrd-setup-root[838]: cut: /sysroot/etc/passwd: No such file or directory Jan 29 11:03:05.710281 initrd-setup-root[845]: cut: /sysroot/etc/group: No such file or directory Jan 29 11:03:05.715217 initrd-setup-root[852]: cut: /sysroot/etc/shadow: No such file or directory Jan 29 11:03:05.719857 initrd-setup-root[859]: cut: /sysroot/etc/gshadow: No such file or directory Jan 29 11:03:05.822537 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 29 11:03:05.826949 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 29 11:03:05.829705 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 29 11:03:05.838882 kernel: BTRFS info (device sda6): last unmount of filesystem 9c6de53f-d522-4994-b092-a63f342c3ab0 Jan 29 11:03:05.858972 ignition[927]: INFO : Ignition 2.20.0 Jan 29 11:03:05.858972 ignition[927]: INFO : Stage: mount Jan 29 11:03:05.860675 ignition[927]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:05.860675 ignition[927]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:05.860675 ignition[927]: INFO : mount: mount passed Jan 29 11:03:05.860675 ignition[927]: INFO : Ignition finished successfully Jan 29 11:03:05.861997 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 29 11:03:05.867004 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 29 11:03:05.868425 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 29 11:03:06.023797 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 29 11:03:06.036157 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 29 11:03:06.045144 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (938) Jan 29 11:03:06.045209 kernel: BTRFS info (device sda6): first mount of filesystem 9c6de53f-d522-4994-b092-a63f342c3ab0 Jan 29 11:03:06.045246 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 29 11:03:06.046104 kernel: BTRFS info (device sda6): using free space tree Jan 29 11:03:06.049372 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 29 11:03:06.049412 kernel: BTRFS info (device sda6): auto enabling async discard Jan 29 11:03:06.052607 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
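flatcar-metadata-hostname.service above does little more than fetch the hostname from the metadata service and write it into the new root before Ignition's files stage runs. A hedged approximation in Python (the real agent is coreos-metadata/afterburn; the URL, target path, and hostname value are taken from the log above):

    import urllib.request

    HOSTNAME_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"
    TARGET = "/sysroot/etc/hostname"

    def write_hostname():
        with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
            hostname = resp.read().decode().strip()  # e.g. ci-4152-2-0-b-6e231d00a9
        with open(TARGET, "w") as f:
            f.write(hostname + "\n")
        print(f"wrote hostname {hostname} to {TARGET}")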
Jan 29 11:03:06.073137 ignition[955]: INFO : Ignition 2.20.0 Jan 29 11:03:06.073137 ignition[955]: INFO : Stage: files Jan 29 11:03:06.075022 ignition[955]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:06.075022 ignition[955]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:06.077324 ignition[955]: DEBUG : files: compiled without relabeling support, skipping Jan 29 11:03:06.079114 ignition[955]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 29 11:03:06.079114 ignition[955]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 29 11:03:06.082782 ignition[955]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 29 11:03:06.084554 ignition[955]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 29 11:03:06.084554 ignition[955]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 29 11:03:06.083202 unknown[955]: wrote ssh authorized keys file for user: core Jan 29 11:03:06.088867 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Jan 29 11:03:06.088867 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Jan 29 11:03:06.132294 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 29 11:03:06.314054 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Jan 29 11:03:06.314054 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Jan 29 11:03:06.316523 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1 Jan 29 11:03:06.913002 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 29 11:03:07.175147 systemd-networkd[777]: eth1: Gained IPv6LL Jan 29 11:03:07.176765 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Jan 29 11:03:07.176765 ignition[955]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 29 11:03:07.179139 ignition[955]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 29 11:03:07.179139 ignition[955]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 29 11:03:07.179139 ignition[955]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 29 11:03:07.179139 ignition[955]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 29 11:03:07.183865 ignition[955]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 29 11:03:07.183865 ignition[955]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 29 11:03:07.183865 ignition[955]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 29 11:03:07.183865 ignition[955]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 29 11:03:07.183865 ignition[955]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 29 11:03:07.183865 ignition[955]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 29 11:03:07.183865 ignition[955]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 29 11:03:07.183865 ignition[955]: INFO : files: files passed Jan 29 11:03:07.183865 ignition[955]: INFO : Ignition finished successfully Jan 29 11:03:07.181874 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 29 11:03:07.190134 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 29 11:03:07.192412 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 29 11:03:07.197393 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 29 11:03:07.197547 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Jan 29 11:03:07.211866 initrd-setup-root-after-ignition[983]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 29 11:03:07.211866 initrd-setup-root-after-ignition[983]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 29 11:03:07.214368 initrd-setup-root-after-ignition[987]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 29 11:03:07.217074 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 29 11:03:07.217903 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 29 11:03:07.221074 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 29 11:03:07.239589 systemd-networkd[777]: eth0: Gained IPv6LL Jan 29 11:03:07.251109 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 29 11:03:07.251263 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 29 11:03:07.252880 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 29 11:03:07.254148 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 29 11:03:07.255455 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 29 11:03:07.265052 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 29 11:03:07.280889 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 29 11:03:07.287076 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 29 11:03:07.298094 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 29 11:03:07.299876 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 29 11:03:07.300575 systemd[1]: Stopped target timers.target - Timer Units. Jan 29 11:03:07.301544 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 29 11:03:07.301679 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 29 11:03:07.303123 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 29 11:03:07.303735 systemd[1]: Stopped target basic.target - Basic System. Jan 29 11:03:07.304887 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 29 11:03:07.306060 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 29 11:03:07.307045 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 29 11:03:07.308164 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 29 11:03:07.309204 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 29 11:03:07.310351 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 29 11:03:07.311336 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 29 11:03:07.312412 systemd[1]: Stopped target swap.target - Swaps. Jan 29 11:03:07.313278 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 29 11:03:07.313399 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 29 11:03:07.314636 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 29 11:03:07.315303 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 29 11:03:07.316339 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Jan 29 11:03:07.319862 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 29 11:03:07.320692 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 29 11:03:07.320835 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 29 11:03:07.322957 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 29 11:03:07.323084 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 29 11:03:07.325177 systemd[1]: ignition-files.service: Deactivated successfully. Jan 29 11:03:07.325322 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 29 11:03:07.326350 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 29 11:03:07.326444 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 29 11:03:07.335588 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 29 11:03:07.341253 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 29 11:03:07.342910 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 29 11:03:07.343043 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 29 11:03:07.344986 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 29 11:03:07.345080 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 29 11:03:07.350028 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 29 11:03:07.350631 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 29 11:03:07.353404 ignition[1007]: INFO : Ignition 2.20.0 Jan 29 11:03:07.353404 ignition[1007]: INFO : Stage: umount Jan 29 11:03:07.353404 ignition[1007]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 29 11:03:07.353404 ignition[1007]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 29 11:03:07.355636 ignition[1007]: INFO : umount: umount passed Jan 29 11:03:07.355636 ignition[1007]: INFO : Ignition finished successfully Jan 29 11:03:07.365316 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 29 11:03:07.367117 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 29 11:03:07.373287 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 29 11:03:07.374313 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 29 11:03:07.375964 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 29 11:03:07.376007 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 29 11:03:07.378690 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 29 11:03:07.378736 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 29 11:03:07.379731 systemd[1]: Stopped target network.target - Network. Jan 29 11:03:07.381920 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 29 11:03:07.381997 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 29 11:03:07.387468 systemd[1]: Stopped target paths.target - Path Units. Jan 29 11:03:07.388360 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 29 11:03:07.388458 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 29 11:03:07.389567 systemd[1]: Stopped target slices.target - Slice Units. Jan 29 11:03:07.393249 systemd[1]: Stopped target sockets.target - Socket Units. 
Jan 29 11:03:07.393927 systemd[1]: iscsid.socket: Deactivated successfully. Jan 29 11:03:07.393977 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 29 11:03:07.401379 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 29 11:03:07.401447 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 29 11:03:07.402893 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 29 11:03:07.402996 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 29 11:03:07.406946 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 29 11:03:07.407014 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 29 11:03:07.408642 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 29 11:03:07.412055 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 29 11:03:07.413656 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 29 11:03:07.415014 systemd-networkd[777]: eth1: DHCPv6 lease lost Jan 29 11:03:07.415802 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 29 11:03:07.415910 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 29 11:03:07.417746 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 29 11:03:07.417914 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 29 11:03:07.421366 systemd-networkd[777]: eth0: DHCPv6 lease lost Jan 29 11:03:07.422932 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 29 11:03:07.423064 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 29 11:03:07.426151 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 29 11:03:07.426292 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 29 11:03:07.428139 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 29 11:03:07.428194 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 29 11:03:07.433993 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 29 11:03:07.434488 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 29 11:03:07.434547 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 29 11:03:07.436131 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 29 11:03:07.436171 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 29 11:03:07.437880 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 29 11:03:07.437924 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 29 11:03:07.438483 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 29 11:03:07.438519 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 29 11:03:07.439271 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 29 11:03:07.452800 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 29 11:03:07.452931 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 29 11:03:07.454967 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 29 11:03:07.455105 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 29 11:03:07.456366 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 29 11:03:07.456407 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Jan 29 11:03:07.457084 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 29 11:03:07.457115 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 29 11:03:07.458268 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 29 11:03:07.458316 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 29 11:03:07.459632 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 29 11:03:07.459672 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 29 11:03:07.461099 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 29 11:03:07.461146 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 29 11:03:07.468200 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 29 11:03:07.468788 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 29 11:03:07.468865 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 29 11:03:07.469520 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 29 11:03:07.469557 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 29 11:03:07.470210 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 29 11:03:07.470255 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 29 11:03:07.471555 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 29 11:03:07.471599 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 11:03:07.477247 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 29 11:03:07.477377 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 29 11:03:07.478490 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 29 11:03:07.488268 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 29 11:03:07.497808 systemd[1]: Switching root. Jan 29 11:03:07.526990 systemd-journald[236]: Journal stopped Jan 29 11:03:08.359839 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). Jan 29 11:03:08.359907 kernel: SELinux: policy capability network_peer_controls=1 Jan 29 11:03:08.359927 kernel: SELinux: policy capability open_perms=1 Jan 29 11:03:08.359937 kernel: SELinux: policy capability extended_socket_class=1 Jan 29 11:03:08.359947 kernel: SELinux: policy capability always_check_network=0 Jan 29 11:03:08.359960 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 29 11:03:08.359971 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 29 11:03:08.359980 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 29 11:03:08.359990 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 29 11:03:08.360000 kernel: audit: type=1403 audit(1738148587.638:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 29 11:03:08.360011 systemd[1]: Successfully loaded SELinux policy in 33.977ms. Jan 29 11:03:08.360031 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.284ms. 
Jan 29 11:03:08.360042 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 29 11:03:08.360055 systemd[1]: Detected virtualization kvm. Jan 29 11:03:08.360069 systemd[1]: Detected architecture arm64. Jan 29 11:03:08.360081 systemd[1]: Detected first boot. Jan 29 11:03:08.360091 systemd[1]: Hostname set to . Jan 29 11:03:08.360102 systemd[1]: Initializing machine ID from VM UUID. Jan 29 11:03:08.360113 zram_generator::config[1049]: No configuration found. Jan 29 11:03:08.360125 systemd[1]: Populated /etc with preset unit settings. Jan 29 11:03:08.360136 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 29 11:03:08.360148 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 29 11:03:08.360158 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 29 11:03:08.360170 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 29 11:03:08.360180 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 29 11:03:08.360191 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 29 11:03:08.360202 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 29 11:03:08.360218 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 29 11:03:08.360230 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 29 11:03:08.360242 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 29 11:03:08.360253 systemd[1]: Created slice user.slice - User and Session Slice. Jan 29 11:03:08.360264 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 29 11:03:08.360275 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 29 11:03:08.360286 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 29 11:03:08.360297 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 29 11:03:08.360308 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 29 11:03:08.360318 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 29 11:03:08.360329 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 29 11:03:08.360341 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 29 11:03:08.360352 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 29 11:03:08.360363 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 29 11:03:08.360377 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 29 11:03:08.360391 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 29 11:03:08.360402 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 29 11:03:08.360414 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 29 11:03:08.360425 systemd[1]: Reached target slices.target - Slice Units. 
Jan 29 11:03:08.360436 systemd[1]: Reached target swap.target - Swaps. Jan 29 11:03:08.360446 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 29 11:03:08.360457 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 29 11:03:08.360468 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 29 11:03:08.360479 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 29 11:03:08.360491 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 29 11:03:08.360501 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 29 11:03:08.360512 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 29 11:03:08.360526 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 29 11:03:08.360536 systemd[1]: Mounting media.mount - External Media Directory... Jan 29 11:03:08.360547 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 29 11:03:08.360558 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 29 11:03:08.360569 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 29 11:03:08.360580 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 29 11:03:08.360591 systemd[1]: Reached target machines.target - Containers. Jan 29 11:03:08.360606 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 29 11:03:08.360619 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 29 11:03:08.360630 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 29 11:03:08.360642 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 29 11:03:08.360653 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 29 11:03:08.360664 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 29 11:03:08.360675 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 29 11:03:08.360687 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 29 11:03:08.360698 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 29 11:03:08.360713 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 29 11:03:08.360724 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 29 11:03:08.360735 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 29 11:03:08.360746 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 29 11:03:08.360757 systemd[1]: Stopped systemd-fsck-usr.service. Jan 29 11:03:08.360771 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 29 11:03:08.360783 kernel: fuse: init (API version 7.39) Jan 29 11:03:08.360793 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 29 11:03:08.360804 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 29 11:03:08.364978 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Jan 29 11:03:08.365008 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 29 11:03:08.365020 systemd[1]: verity-setup.service: Deactivated successfully. Jan 29 11:03:08.365032 systemd[1]: Stopped verity-setup.service. Jan 29 11:03:08.365050 kernel: loop: module loaded Jan 29 11:03:08.365061 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 29 11:03:08.365072 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 29 11:03:08.365083 systemd[1]: Mounted media.mount - External Media Directory. Jan 29 11:03:08.365093 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 29 11:03:08.365104 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 29 11:03:08.365115 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 29 11:03:08.365127 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 29 11:03:08.365137 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 29 11:03:08.365148 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 29 11:03:08.365159 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 29 11:03:08.365170 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 29 11:03:08.365180 kernel: ACPI: bus type drm_connector registered Jan 29 11:03:08.365190 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 29 11:03:08.365201 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 29 11:03:08.365250 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 29 11:03:08.365263 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 29 11:03:08.365275 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 29 11:03:08.365286 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 29 11:03:08.365323 systemd-journald[1116]: Collecting audit messages is disabled. Jan 29 11:03:08.365349 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 29 11:03:08.365361 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 29 11:03:08.365371 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 29 11:03:08.365382 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 29 11:03:08.365393 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 29 11:03:08.365404 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 29 11:03:08.365415 systemd-journald[1116]: Journal started Jan 29 11:03:08.365439 systemd-journald[1116]: Runtime Journal (/run/log/journal/96563e3e90a54625af3538fa1d80f21c) is 8.0M, max 76.6M, 68.6M free. Jan 29 11:03:08.105479 systemd[1]: Queued start job for default target multi-user.target. Jan 29 11:03:08.133459 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 29 11:03:08.134082 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 29 11:03:08.373620 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 29 11:03:08.388323 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 29 11:03:08.388432 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). 
Jan 29 11:03:08.396038 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 29 11:03:08.396122 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 29 11:03:08.412844 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 29 11:03:08.421997 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 29 11:03:08.422071 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 29 11:03:08.426896 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 29 11:03:08.426967 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 29 11:03:08.431843 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 29 11:03:08.434843 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 29 11:03:08.437843 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 29 11:03:08.442914 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 29 11:03:08.450848 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 29 11:03:08.455926 systemd[1]: Started systemd-journald.service - Journal Service. Jan 29 11:03:08.465306 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 29 11:03:08.481452 kernel: loop0: detected capacity change from 0 to 116808 Jan 29 11:03:08.470873 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 29 11:03:08.471693 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 29 11:03:08.472528 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 29 11:03:08.474303 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 29 11:03:08.479307 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 29 11:03:08.503679 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 29 11:03:08.515250 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 29 11:03:08.524840 kernel: loop1: detected capacity change from 0 to 113536 Jan 29 11:03:08.527093 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 29 11:03:08.532197 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 29 11:03:08.543018 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 29 11:03:08.545237 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 29 11:03:08.556939 systemd-journald[1116]: Time spent on flushing to /var/log/journal/96563e3e90a54625af3538fa1d80f21c is 68.977ms for 1137 entries. Jan 29 11:03:08.556939 systemd-journald[1116]: System Journal (/var/log/journal/96563e3e90a54625af3538fa1d80f21c) is 8.0M, max 584.8M, 576.8M free. Jan 29 11:03:08.636977 systemd-journald[1116]: Received client request to flush runtime journal. 
Jan 29 11:03:08.637023 kernel: loop2: detected capacity change from 0 to 8 Jan 29 11:03:08.637040 kernel: loop3: detected capacity change from 0 to 194096 Jan 29 11:03:08.566502 systemd-tmpfiles[1146]: ACLs are not supported, ignoring. Jan 29 11:03:08.566513 systemd-tmpfiles[1146]: ACLs are not supported, ignoring. Jan 29 11:03:08.580264 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 29 11:03:08.601958 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 29 11:03:08.607006 udevadm[1177]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Jan 29 11:03:08.617311 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 29 11:03:08.619083 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 29 11:03:08.641744 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 29 11:03:08.666154 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 29 11:03:08.675972 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 29 11:03:08.691562 kernel: loop4: detected capacity change from 0 to 116808 Jan 29 11:03:08.702876 kernel: loop5: detected capacity change from 0 to 113536 Jan 29 11:03:08.712030 systemd-tmpfiles[1190]: ACLs are not supported, ignoring. Jan 29 11:03:08.712528 systemd-tmpfiles[1190]: ACLs are not supported, ignoring. Jan 29 11:03:08.715018 kernel: loop6: detected capacity change from 0 to 8 Jan 29 11:03:08.718520 kernel: loop7: detected capacity change from 0 to 194096 Jan 29 11:03:08.723377 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 29 11:03:08.739992 (sd-merge)[1191]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Jan 29 11:03:08.740461 (sd-merge)[1191]: Merged extensions into '/usr'. Jan 29 11:03:08.746582 systemd[1]: Reloading requested from client PID 1145 ('systemd-sysext') (unit systemd-sysext.service)... Jan 29 11:03:08.746599 systemd[1]: Reloading... Jan 29 11:03:08.845841 zram_generator::config[1218]: No configuration found. Jan 29 11:03:08.980885 ldconfig[1141]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 29 11:03:09.015269 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 11:03:09.059734 systemd[1]: Reloading finished in 311 ms. Jan 29 11:03:09.084496 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 29 11:03:09.089016 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 29 11:03:09.097499 systemd[1]: Starting ensure-sysext.service... Jan 29 11:03:09.101133 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 29 11:03:09.108747 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 29 11:03:09.119046 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 29 11:03:09.124963 systemd[1]: Reloading requested from client PID 1258 ('systemctl') (unit ensure-sysext.service)... Jan 29 11:03:09.124981 systemd[1]: Reloading... 
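The (sd-merge) lines above are systemd-sysext combining the listed extension images (the loop devices detected just before) with the base /usr tree; conceptually this is a read-only overlayfs whose lower layers are each extension's /usr hierarchy stacked over the host /usr. A rough sketch of how such a lowerdir string could be composed; the staging directory path is an assumption, not the exact mount systemd-sysext performs:

    # Extension names as reported by sd-merge above.
    extensions = ["containerd-flatcar", "docker-flatcar", "kubernetes", "oem-hetzner"]

    # Stack each extension's /usr hierarchy over the host /usr (path layout assumed).
    layers = [f"/run/systemd/sysext/{name}/usr" for name in extensions] + ["/usr"]
    lowerdir = ":".join(layers)

    print("overlayfs lowerdir for /usr: " + lowerdir)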
Jan 29 11:03:09.139373 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 29 11:03:09.139696 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 29 11:03:09.141115 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 29 11:03:09.141716 systemd-tmpfiles[1260]: ACLs are not supported, ignoring. Jan 29 11:03:09.141943 systemd-tmpfiles[1260]: ACLs are not supported, ignoring. Jan 29 11:03:09.145906 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot. Jan 29 11:03:09.146059 systemd-tmpfiles[1260]: Skipping /boot Jan 29 11:03:09.155064 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot. Jan 29 11:03:09.155077 systemd-tmpfiles[1260]: Skipping /boot Jan 29 11:03:09.162172 systemd-udevd[1262]: Using default interface naming scheme 'v255'. Jan 29 11:03:09.224288 zram_generator::config[1285]: No configuration found. Jan 29 11:03:09.372844 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1312) Jan 29 11:03:09.407631 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 11:03:09.425836 kernel: mousedev: PS/2 mouse device common for all mice Jan 29 11:03:09.476971 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 29 11:03:09.477053 systemd[1]: Reloading finished in 351 ms. Jan 29 11:03:09.510325 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 29 11:03:09.514271 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 29 11:03:09.529952 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jan 29 11:03:09.530010 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 29 11:03:09.530085 kernel: [drm] features: -context_init Jan 29 11:03:09.531039 kernel: [drm] number of scanouts: 1 Jan 29 11:03:09.531077 kernel: [drm] number of cap sets: 0 Jan 29 11:03:09.534831 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Jan 29 11:03:09.538093 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jan 29 11:03:09.543837 kernel: Console: switching to colour frame buffer device 160x50 Jan 29 11:03:09.543908 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 29 11:03:09.548855 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 29 11:03:09.557134 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 29 11:03:09.560368 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 29 11:03:09.561970 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 29 11:03:09.564228 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 29 11:03:09.573092 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 29 11:03:09.575892 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Jan 29 11:03:09.580055 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 29 11:03:09.582883 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 29 11:03:09.585051 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 29 11:03:09.594237 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 29 11:03:09.598306 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 29 11:03:09.608077 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 29 11:03:09.613127 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 29 11:03:09.619155 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 29 11:03:09.621369 systemd[1]: Finished ensure-sysext.service. Jan 29 11:03:09.623320 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 29 11:03:09.623459 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 29 11:03:09.625481 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 29 11:03:09.626014 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 29 11:03:09.627997 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 29 11:03:09.628129 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 29 11:03:09.634379 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 29 11:03:09.634512 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 29 11:03:09.644539 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 29 11:03:09.644669 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 29 11:03:09.657115 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 29 11:03:09.663031 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 29 11:03:09.666622 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 29 11:03:09.667631 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 29 11:03:09.668554 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 29 11:03:09.668702 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 11:03:09.678174 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 29 11:03:09.689361 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 29 11:03:09.692195 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 29 11:03:09.704965 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 29 11:03:09.705541 augenrules[1416]: No rules Jan 29 11:03:09.708012 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 29 11:03:09.711007 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 29 11:03:09.711778 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 29 11:03:09.717732 systemd[1]: audit-rules.service: Deactivated successfully. Jan 29 11:03:09.717942 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 29 11:03:09.721639 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 29 11:03:09.726405 lvm[1414]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 29 11:03:09.749437 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 29 11:03:09.753132 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 29 11:03:09.754465 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 29 11:03:09.764317 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 29 11:03:09.775333 lvm[1434]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 29 11:03:09.811018 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 29 11:03:09.826033 systemd-networkd[1383]: lo: Link UP Jan 29 11:03:09.826049 systemd-networkd[1383]: lo: Gained carrier Jan 29 11:03:09.827727 systemd-networkd[1383]: Enumeration completed Jan 29 11:03:09.828278 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 29 11:03:09.829679 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 11:03:09.835026 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 29 11:03:09.835973 systemd-networkd[1383]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:09.835987 systemd-networkd[1383]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 29 11:03:09.837928 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 29 11:03:09.838596 systemd[1]: Reached target time-set.target - System Time Set. Jan 29 11:03:09.839480 systemd-networkd[1383]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:09.839489 systemd-networkd[1383]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 29 11:03:09.840056 systemd-networkd[1383]: eth0: Link UP Jan 29 11:03:09.840064 systemd-networkd[1383]: eth0: Gained carrier Jan 29 11:03:09.840076 systemd-networkd[1383]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:09.843060 systemd-networkd[1383]: eth1: Link UP Jan 29 11:03:09.843070 systemd-networkd[1383]: eth1: Gained carrier Jan 29 11:03:09.843083 systemd-networkd[1383]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 11:03:09.849227 systemd-resolved[1384]: Positive Trust Anchors: Jan 29 11:03:09.849545 systemd-resolved[1384]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 29 11:03:09.849581 systemd-resolved[1384]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 29 11:03:09.853658 systemd-resolved[1384]: Using system hostname 'ci-4152-2-0-b-6e231d00a9'. Jan 29 11:03:09.855549 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 29 11:03:09.857120 systemd[1]: Reached target network.target - Network. Jan 29 11:03:09.858144 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 29 11:03:09.859529 systemd[1]: Reached target sysinit.target - System Initialization. Jan 29 11:03:09.860245 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 29 11:03:09.860951 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 29 11:03:09.861850 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 29 11:03:09.862517 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 29 11:03:09.863199 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 29 11:03:09.863935 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 29 11:03:09.863969 systemd[1]: Reached target paths.target - Path Units. Jan 29 11:03:09.864589 systemd[1]: Reached target timers.target - Timer Units. Jan 29 11:03:09.866124 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 29 11:03:09.868024 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 29 11:03:09.868872 systemd-networkd[1383]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jan 29 11:03:09.878367 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Jan 29 11:03:09.881951 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 29 11:03:09.883653 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 29 11:03:09.885258 systemd[1]: Reached target sockets.target - Socket Units. Jan 29 11:03:09.886411 systemd[1]: Reached target basic.target - Basic System. Jan 29 11:03:09.887482 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 29 11:03:09.887530 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 29 11:03:09.892969 systemd[1]: Starting containerd.service - containerd container runtime... Jan 29 11:03:09.895154 systemd-networkd[1383]: eth0: DHCPv4 address 78.46.186.225/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 29 11:03:09.896971 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 29 11:03:09.898145 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. 
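Both NICs are matched by the catch-all /usr/lib/systemd/network/zz-default.network and pick up DHCPv4 leases (10.0.0.3 on eth1, 78.46.186.225 on eth0). A .network unit of that shape looks roughly like the following; the contents are assumed for illustration, not read from this host:

    [Match]
    Name=*

    [Network]
    DHCP=yes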
Jan 29 11:03:09.900035 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 29 11:03:09.904261 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 29 11:03:09.906013 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 29 11:03:09.908916 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 29 11:03:09.910450 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 29 11:03:09.912110 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 29 11:03:09.915995 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 29 11:03:09.925008 jq[1447]: false Jan 29 11:03:09.920619 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 29 11:03:09.928119 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 29 11:03:09.940965 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 29 11:03:09.942274 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 29 11:03:09.942725 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 29 11:03:09.944510 systemd[1]: Starting update-engine.service - Update Engine... Jan 29 11:03:09.948528 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 29 11:03:09.956190 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 29 11:03:09.956257 dbus-daemon[1446]: [system] SELinux support is enabled Jan 29 11:03:09.956366 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 29 11:03:09.956502 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 29 11:03:09.973320 coreos-metadata[1445]: Jan 29 11:03:09.969 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 29 11:03:09.973320 coreos-metadata[1445]: Jan 29 11:03:09.971 INFO Fetch successful Jan 29 11:03:09.973320 coreos-metadata[1445]: Jan 29 11:03:09.971 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 29 11:03:09.973320 coreos-metadata[1445]: Jan 29 11:03:09.972 INFO Fetch successful Jan 29 11:03:09.995336 systemd[1]: motdgen.service: Deactivated successfully. 
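coreos-metadata (provided by Afterburn on Flatcar) is reading the Hetzner instance metadata service here. The same endpoints it logs can be queried by hand from the instance, for example:

    curl -s http://169.254.169.254/hetzner/v1/metadata
    curl -s http://169.254.169.254/hetzner/v1/metadata/private-networks
    curl -s http://169.254.169.254/hetzner/v1/metadata/public-keys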
Jan 29 11:03:09.996996 extend-filesystems[1448]: Found loop4 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found loop5 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found loop6 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found loop7 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda1 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda2 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda3 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found usr Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda4 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda6 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda7 Jan 29 11:03:09.996996 extend-filesystems[1448]: Found sda9 Jan 29 11:03:09.996996 extend-filesystems[1448]: Checking size of /dev/sda9 Jan 29 11:03:10.056979 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jan 29 11:03:09.995546 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 29 11:03:10.058004 update_engine[1461]: I20250129 11:03:10.034296 1461 main.cc:92] Flatcar Update Engine starting Jan 29 11:03:10.058004 update_engine[1461]: I20250129 11:03:10.044795 1461 update_check_scheduler.cc:74] Next update check in 10m39s Jan 29 11:03:10.062078 extend-filesystems[1448]: Resized partition /dev/sda9 Jan 29 11:03:10.063043 jq[1462]: true Jan 29 11:03:09.999025 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 29 11:03:10.066161 extend-filesystems[1485]: resize2fs 1.47.1 (20-May-2024) Jan 29 11:03:09.999072 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 29 11:03:10.071183 tar[1467]: linux-arm64/helm Jan 29 11:03:10.001766 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 29 11:03:10.072736 jq[1484]: true Jan 29 11:03:10.001788 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 29 11:03:10.008078 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 29 11:03:10.008594 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 29 11:03:10.041641 (ntainerd)[1478]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 29 11:03:10.044261 systemd[1]: Started update-engine.service - Update Engine. Jan 29 11:03:10.049985 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 29 11:03:10.101790 systemd-logind[1458]: New seat seat0. Jan 29 11:03:10.105253 systemd-logind[1458]: Watching system buttons on /dev/input/event0 (Power Button) Jan 29 11:03:10.105269 systemd-logind[1458]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 29 11:03:10.118978 systemd[1]: Started systemd-logind.service - User Login Management. Jan 29 11:03:10.162315 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 29 11:03:10.166533 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
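extend-filesystems.service is growing the root filesystem to fill its partition; the kernel line above shows ext4 on /dev/sda9 being resized on-line from 1617920 to 9393147 blocks. For an ext4 root this boils down to a single on-line resize of the mounted filesystem (a sketch of the operation, not the exact script Flatcar runs):

    resize2fs /dev/sda9   # grows the mounted ext4 filesystem up to the partition size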
Jan 29 11:03:10.174830 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jan 29 11:03:10.183581 extend-filesystems[1485]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 29 11:03:10.183581 extend-filesystems[1485]: old_desc_blocks = 1, new_desc_blocks = 5 Jan 29 11:03:10.183581 extend-filesystems[1485]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jan 29 11:03:10.203675 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1300) Jan 29 11:03:10.185433 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 29 11:03:10.203755 bash[1513]: Updated "/home/core/.ssh/authorized_keys" Jan 29 11:03:10.203862 extend-filesystems[1448]: Resized filesystem in /dev/sda9 Jan 29 11:03:10.203862 extend-filesystems[1448]: Found sr0 Jan 29 11:03:10.187243 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 29 11:03:10.196970 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 29 11:03:10.223903 systemd[1]: Starting sshkeys.service... Jan 29 11:03:10.244069 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 29 11:03:10.250158 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 29 11:03:10.331214 coreos-metadata[1524]: Jan 29 11:03:10.330 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 29 11:03:10.335670 coreos-metadata[1524]: Jan 29 11:03:10.334 INFO Fetch successful Jan 29 11:03:10.336616 unknown[1524]: wrote ssh authorized keys file for user: core Jan 29 11:03:10.361615 locksmithd[1490]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 29 11:03:10.376607 update-ssh-keys[1531]: Updated "/home/core/.ssh/authorized_keys" Jan 29 11:03:10.378969 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 29 11:03:10.383895 systemd[1]: Finished sshkeys.service. Jan 29 11:03:10.428568 containerd[1478]: time="2025-01-29T11:03:10.428433160Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Jan 29 11:03:10.499276 containerd[1478]: time="2025-01-29T11:03:10.498946720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506029480Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506073920Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506094720Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506314600Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506334440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506395120Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506407440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506562480Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506576240Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506589240Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506612 containerd[1478]: time="2025-01-29T11:03:10.506597960Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506909 containerd[1478]: time="2025-01-29T11:03:10.506661280Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 29 11:03:10.506930 containerd[1478]: time="2025-01-29T11:03:10.506905800Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 29 11:03:10.507103 containerd[1478]: time="2025-01-29T11:03:10.507001840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 11:03:10.507103 containerd[1478]: time="2025-01-29T11:03:10.507023840Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 29 11:03:10.507155 containerd[1478]: time="2025-01-29T11:03:10.507103800Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 29 11:03:10.507155 containerd[1478]: time="2025-01-29T11:03:10.507142320Z" level=info msg="metadata content store policy set" policy=shared Jan 29 11:03:10.519570 containerd[1478]: time="2025-01-29T11:03:10.519324560Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 29 11:03:10.519570 containerd[1478]: time="2025-01-29T11:03:10.519405920Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 29 11:03:10.519570 containerd[1478]: time="2025-01-29T11:03:10.519421440Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 29 11:03:10.519570 containerd[1478]: time="2025-01-29T11:03:10.519438440Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 29 11:03:10.519570 containerd[1478]: time="2025-01-29T11:03:10.519460400Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Jan 29 11:03:10.519762 containerd[1478]: time="2025-01-29T11:03:10.519632160Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.519931360Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520038240Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520054320Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520068440Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520081920Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520094520Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520108000Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520122280Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520137440Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520152160Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520164 containerd[1478]: time="2025-01-29T11:03:10.520165920Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520179640Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520239280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520256400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520268840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520281480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520293840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520307080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." 
type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520318080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520331360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520343680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520357880Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520369080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520379840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520391840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520441 containerd[1478]: time="2025-01-29T11:03:10.520405880Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520426480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520438960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520458080Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520632640Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520652200Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520662200Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520673640Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520687280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520701440Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520711680Z" level=info msg="NRI interface is disabled by configuration." Jan 29 11:03:10.520716 containerd[1478]: time="2025-01-29T11:03:10.520721480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Jan 29 11:03:10.523224 containerd[1478]: time="2025-01-29T11:03:10.522081000Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 29 11:03:10.523224 containerd[1478]: time="2025-01-29T11:03:10.522139520Z" level=info msg="Connect containerd service" Jan 29 11:03:10.523224 containerd[1478]: time="2025-01-29T11:03:10.522176280Z" level=info msg="using legacy CRI server" Jan 29 11:03:10.523224 containerd[1478]: time="2025-01-29T11:03:10.522183080Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 29 11:03:10.523224 containerd[1478]: time="2025-01-29T11:03:10.522533320Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.524309160Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 29 11:03:10.527665 
containerd[1478]: time="2025-01-29T11:03:10.524834960Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.524881440Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.524972120Z" level=info msg="Start subscribing containerd event" Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.525008040Z" level=info msg="Start recovering state" Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.525070520Z" level=info msg="Start event monitor" Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.525086480Z" level=info msg="Start snapshots syncer" Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.525097040Z" level=info msg="Start cni network conf syncer for default" Jan 29 11:03:10.527665 containerd[1478]: time="2025-01-29T11:03:10.525104480Z" level=info msg="Start streaming server" Jan 29 11:03:10.525570 systemd[1]: Started containerd.service - containerd container runtime. Jan 29 11:03:10.529986 containerd[1478]: time="2025-01-29T11:03:10.529956080Z" level=info msg="containerd successfully booted in 0.103921s" Jan 29 11:03:10.689675 tar[1467]: linux-arm64/LICENSE Jan 29 11:03:10.689897 tar[1467]: linux-arm64/README.md Jan 29 11:03:10.702067 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 29 11:03:10.826211 sshd_keygen[1489]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 29 11:03:10.847017 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 29 11:03:10.861263 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 29 11:03:10.873167 systemd[1]: issuegen.service: Deactivated successfully. Jan 29 11:03:10.874917 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 29 11:03:10.881367 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 29 11:03:10.899801 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 29 11:03:10.909257 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 29 11:03:10.912242 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 29 11:03:10.913735 systemd[1]: Reached target getty.target - Login Prompts. Jan 29 11:03:11.142997 systemd-networkd[1383]: eth0: Gained IPv6LL Jan 29 11:03:11.144163 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Jan 29 11:03:11.149912 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 29 11:03:11.152292 systemd[1]: Reached target network-online.target - Network is Online. Jan 29 11:03:11.164134 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:03:11.167090 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 29 11:03:11.192566 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 29 11:03:11.719461 systemd-networkd[1383]: eth1: Gained IPv6LL Jan 29 11:03:11.720698 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Jan 29 11:03:11.833970 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:03:11.835786 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 29 11:03:11.838293 systemd[1]: Startup finished in 775ms (kernel) + 4.934s (initrd) + 4.233s (userspace) = 9.942s. 
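The CRI plugin configuration dumped a few entries up (runc via io.containerd.runc.v2 with SystemdCgroup:true, pause image registry.k8s.io/pause:3.8, CNI under /etc/cni/net.d and /opt/cni/bin) corresponds to a containerd config.toml fragment roughly like the one below; this is a sketch of equivalent settings, not a file read from this host. The "failed to load cni during init" error is consistent with /etc/cni/net.d still being empty at this point.

    version = 2

    [plugins."io.containerd.grpc.v1.cri"]
      sandbox_image = "registry.k8s.io/pause:3.8"

      [plugins."io.containerd.grpc.v1.cri".cni]
        bin_dir  = "/opt/cni/bin"
        conf_dir = "/etc/cni/net.d"

      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
        runtime_type = "io.containerd.runc.v2"

        [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
          SystemdCgroup = true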
Jan 29 11:03:11.849587 (kubelet)[1575]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:03:12.414307 kubelet[1575]: E0129 11:03:12.414229 1575 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:03:12.418042 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:03:12.418243 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:03:22.668935 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 29 11:03:22.682176 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:03:22.777266 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:03:22.782937 (kubelet)[1595]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:03:22.835880 kubelet[1595]: E0129 11:03:22.835789 1595 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:03:22.839259 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:03:22.839535 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:03:32.915893 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 29 11:03:32.923158 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:03:33.026111 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:03:33.037464 (kubelet)[1611]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:03:33.096292 kubelet[1611]: E0129 11:03:33.096227 1611 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:03:33.099214 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:03:33.099364 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:03:41.951442 systemd-timesyncd[1396]: Contacted time server 217.91.44.17:123 (2.flatcar.pool.ntp.org). Jan 29 11:03:41.951518 systemd-timesyncd[1396]: Initial clock synchronization to Wed 2025-01-29 11:03:41.883707 UTC. Jan 29 11:03:43.166120 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 29 11:03:43.173012 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:03:43.280944 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
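The kubelet crash loop that starts here (and repeats roughly every ten seconds via the unit's scheduled restarts) is expected on a node that has not yet been joined to a cluster: /var/lib/kubelet/config.yaml is normally written by kubeadm during init/join, so until that happens the kubelet exits immediately. For reference, the missing file is a KubeletConfiguration object; a minimal illustrative sketch, with values assumed rather than taken from this host:

    # /var/lib/kubelet/config.yaml (illustrative only; normally generated by kubeadm)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    staticPodPath: /etc/kubernetes/manifests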
Jan 29 11:03:43.294337 (kubelet)[1628]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:03:43.343346 kubelet[1628]: E0129 11:03:43.343271 1628 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:03:43.347976 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:03:43.348206 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:03:53.416294 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 29 11:03:53.422122 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:03:53.536197 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:03:53.536759 (kubelet)[1645]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:03:53.583838 kubelet[1645]: E0129 11:03:53.583757 1645 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:03:53.587056 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:03:53.587258 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:03:54.940904 update_engine[1461]: I20250129 11:03:54.940505 1461 update_attempter.cc:509] Updating boot flags... Jan 29 11:03:54.989008 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1662) Jan 29 11:03:55.034850 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1661) Jan 29 11:04:03.666380 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 29 11:04:03.677193 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:04:03.776106 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:04:03.780718 (kubelet)[1679]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:04:03.828773 kubelet[1679]: E0129 11:04:03.828726 1679 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:04:03.831648 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:04:03.831861 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:04:13.916308 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jan 29 11:04:13.927195 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:04:14.055140 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 29 11:04:14.055326 (kubelet)[1695]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:04:14.102760 kubelet[1695]: E0129 11:04:14.102703 1695 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:04:14.105614 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:04:14.105910 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:04:24.166203 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Jan 29 11:04:24.178196 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:04:24.292786 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:04:24.297496 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:04:24.351768 kubelet[1711]: E0129 11:04:24.351704 1711 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:04:24.355341 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:04:24.355657 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:04:34.416351 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Jan 29 11:04:34.423052 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:04:34.530030 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:04:34.535669 (kubelet)[1727]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:04:34.580134 kubelet[1727]: E0129 11:04:34.580063 1727 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:04:34.582477 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:04:34.582623 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:04:44.666341 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Jan 29 11:04:44.672161 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:04:44.794609 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 29 11:04:44.806385 (kubelet)[1743]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:04:44.849660 kubelet[1743]: E0129 11:04:44.849602 1743 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:04:44.852444 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:04:44.852586 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:04:54.916703 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Jan 29 11:04:54.927251 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:04:55.093319 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:04:55.093618 (kubelet)[1759]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:04:55.150049 kubelet[1759]: E0129 11:04:55.149989 1759 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:04:55.154030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:04:55.154214 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:05:02.760114 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 29 11:05:02.771377 systemd[1]: Started sshd@0-78.46.186.225:22-147.75.109.163:42818.service - OpenSSH per-connection server daemon (147.75.109.163:42818). Jan 29 11:05:03.769350 sshd[1768]: Accepted publickey for core from 147.75.109.163 port 42818 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:03.770552 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:03.784469 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 29 11:05:03.794226 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 29 11:05:03.797969 systemd-logind[1458]: New session 1 of user core. Jan 29 11:05:03.811713 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 29 11:05:03.821528 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 29 11:05:03.838015 (systemd)[1772]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 29 11:05:03.962028 systemd[1772]: Queued start job for default target default.target. Jan 29 11:05:03.971086 systemd[1772]: Created slice app.slice - User Application Slice. Jan 29 11:05:03.971149 systemd[1772]: Reached target paths.target - Paths. Jan 29 11:05:03.971179 systemd[1772]: Reached target timers.target - Timers. Jan 29 11:05:03.973901 systemd[1772]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 29 11:05:03.990474 systemd[1772]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 29 11:05:03.990687 systemd[1772]: Reached target sockets.target - Sockets. 
Jan 29 11:05:03.990718 systemd[1772]: Reached target basic.target - Basic System. Jan 29 11:05:03.990795 systemd[1772]: Reached target default.target - Main User Target. Jan 29 11:05:03.991088 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 29 11:05:03.992521 systemd[1772]: Startup finished in 146ms. Jan 29 11:05:03.999079 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 29 11:05:04.689228 systemd[1]: Started sshd@1-78.46.186.225:22-147.75.109.163:42826.service - OpenSSH per-connection server daemon (147.75.109.163:42826). Jan 29 11:05:05.165912 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Jan 29 11:05:05.178340 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:05.298470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:05.305280 (kubelet)[1793]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:05:05.357439 kubelet[1793]: E0129 11:05:05.357373 1793 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:05:05.360608 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:05:05.361130 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:05:05.669874 sshd[1783]: Accepted publickey for core from 147.75.109.163 port 42826 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:05.671891 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:05.680144 systemd-logind[1458]: New session 2 of user core. Jan 29 11:05:05.687241 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 29 11:05:06.346136 sshd[1801]: Connection closed by 147.75.109.163 port 42826 Jan 29 11:05:06.346901 sshd-session[1783]: pam_unix(sshd:session): session closed for user core Jan 29 11:05:06.351609 systemd[1]: sshd@1-78.46.186.225:22-147.75.109.163:42826.service: Deactivated successfully. Jan 29 11:05:06.353756 systemd[1]: session-2.scope: Deactivated successfully. Jan 29 11:05:06.354507 systemd-logind[1458]: Session 2 logged out. Waiting for processes to exit. Jan 29 11:05:06.355467 systemd-logind[1458]: Removed session 2. Jan 29 11:05:06.523161 systemd[1]: Started sshd@2-78.46.186.225:22-147.75.109.163:42842.service - OpenSSH per-connection server daemon (147.75.109.163:42842). Jan 29 11:05:07.511891 sshd[1806]: Accepted publickey for core from 147.75.109.163 port 42842 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:07.513757 sshd-session[1806]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:07.518970 systemd-logind[1458]: New session 3 of user core. Jan 29 11:05:07.527133 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 29 11:05:08.191909 sshd[1808]: Connection closed by 147.75.109.163 port 42842 Jan 29 11:05:08.192581 sshd-session[1806]: pam_unix(sshd:session): session closed for user core Jan 29 11:05:08.197472 systemd-logind[1458]: Session 3 logged out. Waiting for processes to exit. 
Jan 29 11:05:08.197607 systemd[1]: sshd@2-78.46.186.225:22-147.75.109.163:42842.service: Deactivated successfully. Jan 29 11:05:08.199356 systemd[1]: session-3.scope: Deactivated successfully. Jan 29 11:05:08.200345 systemd-logind[1458]: Removed session 3. Jan 29 11:05:08.366274 systemd[1]: Started sshd@3-78.46.186.225:22-147.75.109.163:50072.service - OpenSSH per-connection server daemon (147.75.109.163:50072). Jan 29 11:05:09.349879 sshd[1813]: Accepted publickey for core from 147.75.109.163 port 50072 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:09.351654 sshd-session[1813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:09.356628 systemd-logind[1458]: New session 4 of user core. Jan 29 11:05:09.365124 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 29 11:05:10.030899 sshd[1815]: Connection closed by 147.75.109.163 port 50072 Jan 29 11:05:10.031437 sshd-session[1813]: pam_unix(sshd:session): session closed for user core Jan 29 11:05:10.036477 systemd[1]: sshd@3-78.46.186.225:22-147.75.109.163:50072.service: Deactivated successfully. Jan 29 11:05:10.039767 systemd[1]: session-4.scope: Deactivated successfully. Jan 29 11:05:10.041172 systemd-logind[1458]: Session 4 logged out. Waiting for processes to exit. Jan 29 11:05:10.042266 systemd-logind[1458]: Removed session 4. Jan 29 11:05:10.208112 systemd[1]: Started sshd@4-78.46.186.225:22-147.75.109.163:50080.service - OpenSSH per-connection server daemon (147.75.109.163:50080). Jan 29 11:05:11.188660 sshd[1820]: Accepted publickey for core from 147.75.109.163 port 50080 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:11.190967 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:11.196438 systemd-logind[1458]: New session 5 of user core. Jan 29 11:05:11.207118 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 29 11:05:11.721490 sudo[1823]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 29 11:05:11.722183 sudo[1823]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 11:05:11.745000 sudo[1823]: pam_unix(sudo:session): session closed for user root Jan 29 11:05:11.904038 sshd[1822]: Connection closed by 147.75.109.163 port 50080 Jan 29 11:05:11.905170 sshd-session[1820]: pam_unix(sshd:session): session closed for user core Jan 29 11:05:11.910337 systemd-logind[1458]: Session 5 logged out. Waiting for processes to exit. Jan 29 11:05:11.911278 systemd[1]: sshd@4-78.46.186.225:22-147.75.109.163:50080.service: Deactivated successfully. Jan 29 11:05:11.914312 systemd[1]: session-5.scope: Deactivated successfully. Jan 29 11:05:11.915337 systemd-logind[1458]: Removed session 5. Jan 29 11:05:12.092314 systemd[1]: Started sshd@5-78.46.186.225:22-147.75.109.163:50096.service - OpenSSH per-connection server daemon (147.75.109.163:50096). Jan 29 11:05:13.088280 sshd[1828]: Accepted publickey for core from 147.75.109.163 port 50096 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:13.090904 sshd-session[1828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:13.097296 systemd-logind[1458]: New session 6 of user core. Jan 29 11:05:13.102142 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jan 29 11:05:13.617115 sudo[1832]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 29 11:05:13.617456 sudo[1832]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 11:05:13.621537 sudo[1832]: pam_unix(sudo:session): session closed for user root Jan 29 11:05:13.627481 sudo[1831]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 29 11:05:13.627786 sudo[1831]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 11:05:13.648396 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 29 11:05:13.679286 augenrules[1854]: No rules Jan 29 11:05:13.680853 systemd[1]: audit-rules.service: Deactivated successfully. Jan 29 11:05:13.681214 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 29 11:05:13.682797 sudo[1831]: pam_unix(sudo:session): session closed for user root Jan 29 11:05:13.844932 sshd[1830]: Connection closed by 147.75.109.163 port 50096 Jan 29 11:05:13.844159 sshd-session[1828]: pam_unix(sshd:session): session closed for user core Jan 29 11:05:13.848719 systemd[1]: sshd@5-78.46.186.225:22-147.75.109.163:50096.service: Deactivated successfully. Jan 29 11:05:13.850450 systemd[1]: session-6.scope: Deactivated successfully. Jan 29 11:05:13.852255 systemd-logind[1458]: Session 6 logged out. Waiting for processes to exit. Jan 29 11:05:13.853521 systemd-logind[1458]: Removed session 6. Jan 29 11:05:14.017108 systemd[1]: Started sshd@6-78.46.186.225:22-147.75.109.163:50100.service - OpenSSH per-connection server daemon (147.75.109.163:50100). Jan 29 11:05:14.999319 sshd[1862]: Accepted publickey for core from 147.75.109.163 port 50100 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:05:15.001236 sshd-session[1862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:05:15.006466 systemd-logind[1458]: New session 7 of user core. Jan 29 11:05:15.017125 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 29 11:05:15.416492 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Jan 29 11:05:15.427537 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:15.521099 sudo[1868]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 29 11:05:15.521419 sudo[1868]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 11:05:15.548083 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:15.549650 (kubelet)[1878]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:05:15.596878 kubelet[1878]: E0129 11:05:15.593432 1878 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:05:15.596082 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:05:15.596211 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 11:05:15.821193 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 29 11:05:15.821361 (dockerd)[1899]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 29 11:05:16.052386 dockerd[1899]: time="2025-01-29T11:05:16.052313694Z" level=info msg="Starting up" Jan 29 11:05:16.126313 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1481772655-merged.mount: Deactivated successfully. Jan 29 11:05:16.132214 systemd[1]: var-lib-docker-metacopy\x2dcheck3246720787-merged.mount: Deactivated successfully. Jan 29 11:05:16.149540 dockerd[1899]: time="2025-01-29T11:05:16.149485754Z" level=info msg="Loading containers: start." Jan 29 11:05:16.311041 kernel: Initializing XFRM netlink socket Jan 29 11:05:16.397041 systemd-networkd[1383]: docker0: Link UP Jan 29 11:05:16.441454 dockerd[1899]: time="2025-01-29T11:05:16.441227213Z" level=info msg="Loading containers: done." Jan 29 11:05:16.456579 dockerd[1899]: time="2025-01-29T11:05:16.456512615Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 29 11:05:16.456785 dockerd[1899]: time="2025-01-29T11:05:16.456628454Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1 Jan 29 11:05:16.456785 dockerd[1899]: time="2025-01-29T11:05:16.456764133Z" level=info msg="Daemon has completed initialization" Jan 29 11:05:16.494259 dockerd[1899]: time="2025-01-29T11:05:16.494136181Z" level=info msg="API listen on /run/docker.sock" Jan 29 11:05:16.494355 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 29 11:05:17.592823 containerd[1478]: time="2025-01-29T11:05:17.592757195Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.9\"" Jan 29 11:05:18.160436 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount950558406.mount: Deactivated successfully. 
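The docker.service start above ends with the daemon reporting "API listen on /run/docker.sock". A minimal, stand-alone Go sketch (not part of this log; /_ping is the standard Docker Engine API health endpoint and the socket path is the one reported above) that confirms the daemon is answering on that socket:

package main

import (
    "context"
    "fmt"
    "log"
    "net"
    "net/http"
)

func main() {
    // Speak HTTP over the unix socket; the host part of the URL is ignored
    // because the dialer below always connects to the daemon socket.
    client := &http.Client{
        Transport: &http.Transport{
            DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
                return (&net.Dialer{}).DialContext(ctx, "unix", "/run/docker.sock")
            },
        },
    }
    resp, err := client.Get("http://docker/_ping") // Engine API health check, returns "OK" when the daemon is up
    if err != nil {
        log.Fatalf("docker daemon not reachable: %v", err)
    }
    defer resp.Body.Close()
    fmt.Println("docker daemon:", resp.Status)
}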
Jan 29 11:05:19.064643 containerd[1478]: time="2025-01-29T11:05:19.063831811Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:19.066225 containerd[1478]: time="2025-01-29T11:05:19.066011805Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.9: active requests=0, bytes read=29865027" Jan 29 11:05:19.067602 containerd[1478]: time="2025-01-29T11:05:19.067540429Z" level=info msg="ImageCreate event name:\"sha256:5a490fe478de4f27039cf07d124901df2a58010e72f7afe3f65c70c05ada6715\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:19.070965 containerd[1478]: time="2025-01-29T11:05:19.070911481Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:540de8f810ac963b8ed93f7393a8746d68e7e8a2c79ea58ff409ac5b9ca6a9fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:19.072724 containerd[1478]: time="2025-01-29T11:05:19.072452945Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.9\" with image id \"sha256:5a490fe478de4f27039cf07d124901df2a58010e72f7afe3f65c70c05ada6715\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:540de8f810ac963b8ed93f7393a8746d68e7e8a2c79ea58ff409ac5b9ca6a9fc\", size \"29861735\" in 1.479650429s" Jan 29 11:05:19.072724 containerd[1478]: time="2025-01-29T11:05:19.072496425Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.9\" returns image reference \"sha256:5a490fe478de4f27039cf07d124901df2a58010e72f7afe3f65c70c05ada6715\"" Jan 29 11:05:19.096267 containerd[1478]: time="2025-01-29T11:05:19.095987149Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.9\"" Jan 29 11:05:20.740427 containerd[1478]: time="2025-01-29T11:05:20.740358678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:20.742396 containerd[1478]: time="2025-01-29T11:05:20.742266466Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.9: active requests=0, bytes read=26901581" Jan 29 11:05:20.742396 containerd[1478]: time="2025-01-29T11:05:20.742321027Z" level=info msg="ImageCreate event name:\"sha256:cd43f1277f3b33fd1db15e7f98b093eb07e4d4530ff326356591daeb16369ca2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:20.745665 containerd[1478]: time="2025-01-29T11:05:20.745601516Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6350693c04956b13db2519e01ca12a0bbe58466e9f12ef8617f1429da6081f43\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:20.747840 containerd[1478]: time="2025-01-29T11:05:20.746896335Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.9\" with image id \"sha256:cd43f1277f3b33fd1db15e7f98b093eb07e4d4530ff326356591daeb16369ca2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6350693c04956b13db2519e01ca12a0bbe58466e9f12ef8617f1429da6081f43\", size \"28305351\" in 1.650864705s" Jan 29 11:05:20.747840 containerd[1478]: time="2025-01-29T11:05:20.746938056Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.9\" returns image reference \"sha256:cd43f1277f3b33fd1db15e7f98b093eb07e4d4530ff326356591daeb16369ca2\"" Jan 29 11:05:20.771294 
containerd[1478]: time="2025-01-29T11:05:20.771198498Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.9\"" Jan 29 11:05:21.828357 containerd[1478]: time="2025-01-29T11:05:21.828291763Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:21.830194 containerd[1478]: time="2025-01-29T11:05:21.830138189Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.9: active requests=0, bytes read=16164358" Jan 29 11:05:21.830936 containerd[1478]: time="2025-01-29T11:05:21.830317392Z" level=info msg="ImageCreate event name:\"sha256:4ebb50f72fd1ba66a57f91b338174ab72034493ff261ebb9bbfd717d882178ce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:21.833785 containerd[1478]: time="2025-01-29T11:05:21.833709401Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:153efd6dc89e61a38ef273cf4c4cebd2bfee68082c2ee3d4fab5da94e4ae13d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:21.835024 containerd[1478]: time="2025-01-29T11:05:21.834887258Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.9\" with image id \"sha256:4ebb50f72fd1ba66a57f91b338174ab72034493ff261ebb9bbfd717d882178ce\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:153efd6dc89e61a38ef273cf4c4cebd2bfee68082c2ee3d4fab5da94e4ae13d3\", size \"17568146\" in 1.063647198s" Jan 29 11:05:21.835024 containerd[1478]: time="2025-01-29T11:05:21.834927338Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.9\" returns image reference \"sha256:4ebb50f72fd1ba66a57f91b338174ab72034493ff261ebb9bbfd717d882178ce\"" Jan 29 11:05:21.857684 containerd[1478]: time="2025-01-29T11:05:21.857645665Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.9\"" Jan 29 11:05:22.903892 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1444672798.mount: Deactivated successfully. 
Jan 29 11:05:23.228085 containerd[1478]: time="2025-01-29T11:05:23.227931317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:23.231056 containerd[1478]: time="2025-01-29T11:05:23.230875877Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.9: active requests=0, bytes read=25662738" Jan 29 11:05:23.235694 containerd[1478]: time="2025-01-29T11:05:23.234987572Z" level=info msg="ImageCreate event name:\"sha256:d97113839930faa5ab88f70aff4bfb62f7381074a290dd5aadbec9b16b2567a2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:23.238830 containerd[1478]: time="2025-01-29T11:05:23.237756929Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:d78dc40d97ff862fd8ddb47f80a5ba3feec17bc73e58a60e963885e33faa0083\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:23.238830 containerd[1478]: time="2025-01-29T11:05:23.238613420Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.9\" with image id \"sha256:d97113839930faa5ab88f70aff4bfb62f7381074a290dd5aadbec9b16b2567a2\", repo tag \"registry.k8s.io/kube-proxy:v1.30.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:d78dc40d97ff862fd8ddb47f80a5ba3feec17bc73e58a60e963885e33faa0083\", size \"25661731\" in 1.380928554s" Jan 29 11:05:23.238830 containerd[1478]: time="2025-01-29T11:05:23.238643421Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.9\" returns image reference \"sha256:d97113839930faa5ab88f70aff4bfb62f7381074a290dd5aadbec9b16b2567a2\"" Jan 29 11:05:23.264784 containerd[1478]: time="2025-01-29T11:05:23.264679729Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 29 11:05:23.830534 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3349610440.mount: Deactivated successfully. 
Jan 29 11:05:24.443766 containerd[1478]: time="2025-01-29T11:05:24.443637674Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:24.445396 containerd[1478]: time="2025-01-29T11:05:24.445091373Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485461" Jan 29 11:05:24.446306 containerd[1478]: time="2025-01-29T11:05:24.446267068Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:24.450004 containerd[1478]: time="2025-01-29T11:05:24.449941836Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:24.451266 containerd[1478]: time="2025-01-29T11:05:24.451215732Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.186456842s" Jan 29 11:05:24.451266 containerd[1478]: time="2025-01-29T11:05:24.451258013Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Jan 29 11:05:24.475467 containerd[1478]: time="2025-01-29T11:05:24.475430084Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Jan 29 11:05:25.014486 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2124169298.mount: Deactivated successfully. 
Jan 29 11:05:25.023798 containerd[1478]: time="2025-01-29T11:05:25.023621495Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:25.026225 containerd[1478]: time="2025-01-29T11:05:25.026139366Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268841" Jan 29 11:05:25.027492 containerd[1478]: time="2025-01-29T11:05:25.027411622Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:25.030169 containerd[1478]: time="2025-01-29T11:05:25.030095535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:25.031228 containerd[1478]: time="2025-01-29T11:05:25.030852824Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 555.38258ms" Jan 29 11:05:25.031228 containerd[1478]: time="2025-01-29T11:05:25.030892865Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\"" Jan 29 11:05:25.053711 containerd[1478]: time="2025-01-29T11:05:25.053626467Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Jan 29 11:05:25.614066 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. Jan 29 11:05:25.623661 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:25.627665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2127399531.mount: Deactivated successfully. Jan 29 11:05:25.757011 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:25.768477 (kubelet)[2255]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 11:05:25.828254 kubelet[2255]: E0129 11:05:25.828207 2255 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 11:05:25.831421 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 11:05:25.831576 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
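kubelet.service has now failed three times in a row (restart counters 11, 12 and 13) for the same reason: /var/lib/kubelet/config.yaml does not exist yet, so every start exits immediately and systemd schedules the next attempt. A stand-alone Go sketch (an illustration of the failing step, not kubelet source) showing why each attempt ends as status=1/FAILURE:

package main

import (
    "fmt"
    "os"
)

// loadKubeletConfig mirrors the failing step seen at run.go:74 above: the
// config file the unit expects has not been written yet, so reading it
// returns ENOENT and the process exits non-zero.
func loadKubeletConfig(path string) ([]byte, error) {
    data, err := os.ReadFile(path)
    if err != nil {
        return nil, fmt.Errorf("failed to read kubelet config file %q, error: %w", path, err)
    }
    return data, nil
}

func main() {
    if _, err := loadKubeletConfig("/var/lib/kubelet/config.yaml"); err != nil {
        fmt.Fprintln(os.Stderr, "command failed:", err)
        os.Exit(1) // recorded by systemd as status=1/FAILURE, triggering the scheduled restart
    }
}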
Jan 29 11:05:27.135771 containerd[1478]: time="2025-01-29T11:05:27.135644004Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:27.137596 containerd[1478]: time="2025-01-29T11:05:27.137519705Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191552" Jan 29 11:05:27.138678 containerd[1478]: time="2025-01-29T11:05:27.138603958Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:27.144569 containerd[1478]: time="2025-01-29T11:05:27.144487386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:27.147000 containerd[1478]: time="2025-01-29T11:05:27.146850093Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 2.092941543s" Jan 29 11:05:27.147000 containerd[1478]: time="2025-01-29T11:05:27.146893333Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\"" Jan 29 11:05:32.423841 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:32.432356 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:32.455661 systemd[1]: Reloading requested from client PID 2364 ('systemctl') (unit session-7.scope)... Jan 29 11:05:32.455681 systemd[1]: Reloading... Jan 29 11:05:32.581850 zram_generator::config[2407]: No configuration found. Jan 29 11:05:32.680464 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 11:05:32.746864 systemd[1]: Reloading finished in 290 ms. Jan 29 11:05:32.810576 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 29 11:05:32.810734 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 29 11:05:32.811225 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:32.823314 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:32.968192 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:32.968280 (kubelet)[2453]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 29 11:05:33.018684 kubelet[2453]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 11:05:33.018684 kubelet[2453]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Jan 29 11:05:33.018684 kubelet[2453]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 11:05:33.019160 kubelet[2453]: I0129 11:05:33.018921 2453 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 29 11:05:34.023898 kubelet[2453]: I0129 11:05:34.023556 2453 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Jan 29 11:05:34.023898 kubelet[2453]: I0129 11:05:34.023595 2453 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 29 11:05:34.024425 kubelet[2453]: I0129 11:05:34.023925 2453 server.go:927] "Client rotation is on, will bootstrap in background" Jan 29 11:05:34.046524 kubelet[2453]: E0129 11:05:34.046256 2453 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://78.46.186.225:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.046524 kubelet[2453]: I0129 11:05:34.046409 2453 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 29 11:05:34.055937 kubelet[2453]: I0129 11:05:34.055892 2453 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 29 11:05:34.057834 kubelet[2453]: I0129 11:05:34.057692 2453 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 29 11:05:34.058014 kubelet[2453]: I0129 11:05:34.057746 2453 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4152-2-0-b-6e231d00a9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 29 11:05:34.058118 kubelet[2453]: I0129 11:05:34.058045 2453 topology_manager.go:138] "Creating 
topology manager with none policy" Jan 29 11:05:34.058118 kubelet[2453]: I0129 11:05:34.058056 2453 container_manager_linux.go:301] "Creating device plugin manager" Jan 29 11:05:34.058338 kubelet[2453]: I0129 11:05:34.058308 2453 state_mem.go:36] "Initialized new in-memory state store" Jan 29 11:05:34.061121 kubelet[2453]: I0129 11:05:34.060798 2453 kubelet.go:400] "Attempting to sync node with API server" Jan 29 11:05:34.061121 kubelet[2453]: I0129 11:05:34.060926 2453 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 29 11:05:34.061121 kubelet[2453]: W0129 11:05:34.060996 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://78.46.186.225:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152-2-0-b-6e231d00a9&limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.061121 kubelet[2453]: E0129 11:05:34.061086 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://78.46.186.225:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152-2-0-b-6e231d00a9&limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.061121 kubelet[2453]: I0129 11:05:34.061120 2453 kubelet.go:312] "Adding apiserver pod source" Jan 29 11:05:34.061432 kubelet[2453]: I0129 11:05:34.061175 2453 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 29 11:05:34.063270 kubelet[2453]: I0129 11:05:34.063187 2453 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 29 11:05:34.063919 kubelet[2453]: I0129 11:05:34.063879 2453 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 29 11:05:34.064046 kubelet[2453]: W0129 11:05:34.064014 2453 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
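The container manager dump above lists the hard eviction thresholds in effect (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%). A short Go sketch turning those percentages into concrete trigger points for a hypothetical filesystem size (the 80 GiB figure is illustrative only, not read from this machine):

package main

import "fmt"

func main() {
    const gib = float64(1 << 30)
    nodefs := 80 * gib // hypothetical root filesystem size
    imagefs := nodefs  // assume images live on the same filesystem

    fmt.Println("memory.available  < 100Mi -> evict once free memory drops below 100 MiB")
    fmt.Printf("nodefs.available  < 10%%   -> evict below %.0f GiB free (of %.0f GiB)\n", 0.10*nodefs/gib, nodefs/gib)
    fmt.Printf("imagefs.available < 15%%   -> evict below %.0f GiB free (of %.0f GiB)\n", 0.15*imagefs/gib, imagefs/gib)
    fmt.Println("nodefs.inodesFree / imagefs.inodesFree < 5% -> evict below 5% of inodes free")
}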
Jan 29 11:05:34.065358 kubelet[2453]: I0129 11:05:34.065321 2453 server.go:1264] "Started kubelet" Jan 29 11:05:34.065610 kubelet[2453]: W0129 11:05:34.065541 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://78.46.186.225:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.065652 kubelet[2453]: E0129 11:05:34.065629 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://78.46.186.225:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.070019 kubelet[2453]: I0129 11:05:34.069945 2453 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 29 11:05:34.073978 kubelet[2453]: I0129 11:05:34.073905 2453 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 29 11:05:34.074125 kubelet[2453]: E0129 11:05:34.073740 2453 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://78.46.186.225:6443/api/v1/namespaces/default/events\": dial tcp 78.46.186.225:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4152-2-0-b-6e231d00a9.181f251084c1a0a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152-2-0-b-6e231d00a9,UID:ci-4152-2-0-b-6e231d00a9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152-2-0-b-6e231d00a9,},FirstTimestamp:2025-01-29 11:05:34.065287331 +0000 UTC m=+1.089686570,LastTimestamp:2025-01-29 11:05:34.065287331 +0000 UTC m=+1.089686570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152-2-0-b-6e231d00a9,}" Jan 29 11:05:34.074292 kubelet[2453]: I0129 11:05:34.074264 2453 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 29 11:05:34.075211 kubelet[2453]: I0129 11:05:34.075171 2453 server.go:455] "Adding debug handlers to kubelet server" Jan 29 11:05:34.077034 kubelet[2453]: I0129 11:05:34.077005 2453 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 29 11:05:34.084303 kubelet[2453]: W0129 11:05:34.083942 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://78.46.186.225:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.084303 kubelet[2453]: E0129 11:05:34.083993 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://78.46.186.225:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.084303 kubelet[2453]: E0129 11:05:34.084042 2453 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.186.225:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152-2-0-b-6e231d00a9?timeout=10s\": dial tcp 78.46.186.225:6443: connect: connection refused" interval="200ms" Jan 29 11:05:34.084303 kubelet[2453]: I0129 11:05:34.084135 2453 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 29 11:05:34.084303 
kubelet[2453]: I0129 11:05:34.084149 2453 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 29 11:05:34.086197 kubelet[2453]: I0129 11:05:34.085560 2453 reconciler.go:26] "Reconciler: start to sync state" Jan 29 11:05:34.086197 kubelet[2453]: I0129 11:05:34.085873 2453 factory.go:221] Registration of the systemd container factory successfully Jan 29 11:05:34.086197 kubelet[2453]: I0129 11:05:34.085966 2453 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 29 11:05:34.087912 kubelet[2453]: I0129 11:05:34.087776 2453 factory.go:221] Registration of the containerd container factory successfully Jan 29 11:05:34.097228 kubelet[2453]: I0129 11:05:34.097082 2453 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 29 11:05:34.098178 kubelet[2453]: I0129 11:05:34.098157 2453 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 29 11:05:34.098662 kubelet[2453]: I0129 11:05:34.098272 2453 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 29 11:05:34.098662 kubelet[2453]: I0129 11:05:34.098301 2453 kubelet.go:2337] "Starting kubelet main sync loop" Jan 29 11:05:34.098662 kubelet[2453]: E0129 11:05:34.098343 2453 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 29 11:05:34.106149 kubelet[2453]: W0129 11:05:34.106060 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://78.46.186.225:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.106149 kubelet[2453]: E0129 11:05:34.106119 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://78.46.186.225:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:34.106301 kubelet[2453]: E0129 11:05:34.106239 2453 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 29 11:05:34.116048 kubelet[2453]: I0129 11:05:34.115884 2453 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 29 11:05:34.116048 kubelet[2453]: I0129 11:05:34.115902 2453 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 29 11:05:34.116048 kubelet[2453]: I0129 11:05:34.115923 2453 state_mem.go:36] "Initialized new in-memory state store" Jan 29 11:05:34.118617 kubelet[2453]: I0129 11:05:34.118538 2453 policy_none.go:49] "None policy: Start" Jan 29 11:05:34.119560 kubelet[2453]: I0129 11:05:34.119540 2453 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 29 11:05:34.119645 kubelet[2453]: I0129 11:05:34.119569 2453 state_mem.go:35] "Initializing new in-memory state store" Jan 29 11:05:34.126988 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 29 11:05:34.143464 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 29 11:05:34.147588 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 29 11:05:34.156923 kubelet[2453]: I0129 11:05:34.156878 2453 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 29 11:05:34.157977 kubelet[2453]: I0129 11:05:34.157433 2453 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 29 11:05:34.157977 kubelet[2453]: I0129 11:05:34.157619 2453 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 29 11:05:34.161188 kubelet[2453]: E0129 11:05:34.160687 2453 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4152-2-0-b-6e231d00a9\" not found" Jan 29 11:05:34.186603 kubelet[2453]: I0129 11:05:34.186509 2453 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.187152 kubelet[2453]: E0129 11:05:34.187118 2453 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://78.46.186.225:6443/api/v1/nodes\": dial tcp 78.46.186.225:6443: connect: connection refused" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.199322 kubelet[2453]: I0129 11:05:34.199250 2453 topology_manager.go:215] "Topology Admit Handler" podUID="31f5e2f665aa44c47678d1c5dcb56330" podNamespace="kube-system" podName="kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.202436 kubelet[2453]: I0129 11:05:34.202288 2453 topology_manager.go:215] "Topology Admit Handler" podUID="b1d1b206fa4fded7a9d943911580657c" podNamespace="kube-system" podName="kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.204786 kubelet[2453]: I0129 11:05:34.204408 2453 topology_manager.go:215] "Topology Admit Handler" podUID="2f4dc54942d4fda2da45d30d85407f19" podNamespace="kube-system" podName="kube-scheduler-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.211728 systemd[1]: Created slice kubepods-burstable-pod31f5e2f665aa44c47678d1c5dcb56330.slice - libcontainer container kubepods-burstable-pod31f5e2f665aa44c47678d1c5dcb56330.slice. Jan 29 11:05:34.231331 systemd[1]: Created slice kubepods-burstable-podb1d1b206fa4fded7a9d943911580657c.slice - libcontainer container kubepods-burstable-podb1d1b206fa4fded7a9d943911580657c.slice. Jan 29 11:05:34.246489 systemd[1]: Created slice kubepods-burstable-pod2f4dc54942d4fda2da45d30d85407f19.slice - libcontainer container kubepods-burstable-pod2f4dc54942d4fda2da45d30d85407f19.slice. 
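With the systemd cgroup driver configured above ("CgroupDriver":"systemd"), each admitted static pod gets a kubepods-burstable-pod<uid>.slice unit. systemd nests slices by the dash-separated components of their name, so (assuming the unified cgroup hierarchy mounted at /sys/fs/cgroup) the slice created for the kube-apiserver pod sits several levels deep. A small Go sketch of that name-to-path expansion:

package main

import (
    "fmt"
    "strings"
)

// slicePath expands a systemd slice name into its cgroup directory: every
// dash-separated prefix is a parent slice ("kubepods-burstable.slice" lives
// inside "kubepods.slice").
func slicePath(slice string) string {
    name := strings.TrimSuffix(slice, ".slice")
    parts := strings.Split(name, "-")
    path := "/sys/fs/cgroup"
    for i := range parts {
        path += "/" + strings.Join(parts[:i+1], "-") + ".slice"
    }
    return path
}

func main() {
    // Pod UID taken from the kube-apiserver static pod admitted above.
    fmt.Println(slicePath("kubepods-burstable-pod31f5e2f665aa44c47678d1c5dcb56330.slice"))
}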
Jan 29 11:05:34.285092 kubelet[2453]: E0129 11:05:34.284685 2453 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.186.225:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152-2-0-b-6e231d00a9?timeout=10s\": dial tcp 78.46.186.225:6443: connect: connection refused" interval="400ms" Jan 29 11:05:34.286543 kubelet[2453]: I0129 11:05:34.286019 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/31f5e2f665aa44c47678d1c5dcb56330-ca-certs\") pod \"kube-apiserver-ci-4152-2-0-b-6e231d00a9\" (UID: \"31f5e2f665aa44c47678d1c5dcb56330\") " pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.286543 kubelet[2453]: I0129 11:05:34.286069 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/31f5e2f665aa44c47678d1c5dcb56330-k8s-certs\") pod \"kube-apiserver-ci-4152-2-0-b-6e231d00a9\" (UID: \"31f5e2f665aa44c47678d1c5dcb56330\") " pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.286543 kubelet[2453]: I0129 11:05:34.286107 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-k8s-certs\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.286543 kubelet[2453]: I0129 11:05:34.286136 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-kubeconfig\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.286543 kubelet[2453]: I0129 11:05:34.286161 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/31f5e2f665aa44c47678d1c5dcb56330-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4152-2-0-b-6e231d00a9\" (UID: \"31f5e2f665aa44c47678d1c5dcb56330\") " pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.287025 kubelet[2453]: I0129 11:05:34.286186 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-ca-certs\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.287025 kubelet[2453]: I0129 11:05:34.286208 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-flexvolume-dir\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.287025 kubelet[2453]: I0129 11:05:34.286232 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.287025 kubelet[2453]: I0129 11:05:34.286267 2453 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2f4dc54942d4fda2da45d30d85407f19-kubeconfig\") pod \"kube-scheduler-ci-4152-2-0-b-6e231d00a9\" (UID: \"2f4dc54942d4fda2da45d30d85407f19\") " pod="kube-system/kube-scheduler-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.390670 kubelet[2453]: I0129 11:05:34.390600 2453 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.391267 kubelet[2453]: E0129 11:05:34.391160 2453 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://78.46.186.225:6443/api/v1/nodes\": dial tcp 78.46.186.225:6443: connect: connection refused" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.529782 containerd[1478]: time="2025-01-29T11:05:34.529696364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4152-2-0-b-6e231d00a9,Uid:31f5e2f665aa44c47678d1c5dcb56330,Namespace:kube-system,Attempt:0,}" Jan 29 11:05:34.543947 containerd[1478]: time="2025-01-29T11:05:34.543393284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4152-2-0-b-6e231d00a9,Uid:b1d1b206fa4fded7a9d943911580657c,Namespace:kube-system,Attempt:0,}" Jan 29 11:05:34.550037 containerd[1478]: time="2025-01-29T11:05:34.549988742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4152-2-0-b-6e231d00a9,Uid:2f4dc54942d4fda2da45d30d85407f19,Namespace:kube-system,Attempt:0,}" Jan 29 11:05:34.685292 kubelet[2453]: E0129 11:05:34.685229 2453 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.186.225:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152-2-0-b-6e231d00a9?timeout=10s\": dial tcp 78.46.186.225:6443: connect: connection refused" interval="800ms" Jan 29 11:05:34.794743 kubelet[2453]: I0129 11:05:34.794589 2453 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:34.795475 kubelet[2453]: E0129 11:05:34.795440 2453 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://78.46.186.225:6443/api/v1/nodes\": dial tcp 78.46.186.225:6443: connect: connection refused" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:35.049594 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3998594082.mount: Deactivated successfully. 
Jan 29 11:05:35.056716 containerd[1478]: time="2025-01-29T11:05:35.056626688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 11:05:35.058938 containerd[1478]: time="2025-01-29T11:05:35.058850706Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Jan 29 11:05:35.062154 containerd[1478]: time="2025-01-29T11:05:35.062081294Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 11:05:35.064070 containerd[1478]: time="2025-01-29T11:05:35.063865949Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 29 11:05:35.065690 containerd[1478]: time="2025-01-29T11:05:35.065636444Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 11:05:35.066802 containerd[1478]: time="2025-01-29T11:05:35.066748093Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 11:05:35.067603 containerd[1478]: time="2025-01-29T11:05:35.067569260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 11:05:35.068530 containerd[1478]: time="2025-01-29T11:05:35.068459547Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 29 11:05:35.069240 containerd[1478]: time="2025-01-29T11:05:35.069205914Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 519.132452ms" Jan 29 11:05:35.073325 containerd[1478]: time="2025-01-29T11:05:35.073164907Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 543.261422ms" Jan 29 11:05:35.075355 containerd[1478]: time="2025-01-29T11:05:35.073921473Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 530.416548ms" Jan 29 11:05:35.114924 kubelet[2453]: W0129 11:05:35.113476 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://78.46.186.225:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:35.114924 
kubelet[2453]: E0129 11:05:35.113553 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://78.46.186.225:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:35.203014 containerd[1478]: time="2025-01-29T11:05:35.201880392Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:05:35.203014 containerd[1478]: time="2025-01-29T11:05:35.201978992Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:05:35.203014 containerd[1478]: time="2025-01-29T11:05:35.201995753Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:35.203014 containerd[1478]: time="2025-01-29T11:05:35.202177274Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:35.207354 containerd[1478]: time="2025-01-29T11:05:35.207177276Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:05:35.207751 containerd[1478]: time="2025-01-29T11:05:35.207701481Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:05:35.208034 containerd[1478]: time="2025-01-29T11:05:35.207898162Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:35.208362 containerd[1478]: time="2025-01-29T11:05:35.208302726Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:35.209068 containerd[1478]: time="2025-01-29T11:05:35.208575288Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:05:35.209068 containerd[1478]: time="2025-01-29T11:05:35.208923691Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:05:35.209068 containerd[1478]: time="2025-01-29T11:05:35.208953611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:35.211995 containerd[1478]: time="2025-01-29T11:05:35.211827075Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:35.233067 systemd[1]: Started cri-containerd-2fbf92f363bb6e201d25fab72dd5936e3649147c837a7dcfe773a5ebac655f0b.scope - libcontainer container 2fbf92f363bb6e201d25fab72dd5936e3649147c837a7dcfe773a5ebac655f0b. Jan 29 11:05:35.238021 systemd[1]: Started cri-containerd-6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba.scope - libcontainer container 6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba. Jan 29 11:05:35.249346 systemd[1]: Started cri-containerd-a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572.scope - libcontainer container a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572. 
Jan 29 11:05:35.284357 containerd[1478]: time="2025-01-29T11:05:35.283915203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4152-2-0-b-6e231d00a9,Uid:31f5e2f665aa44c47678d1c5dcb56330,Namespace:kube-system,Attempt:0,} returns sandbox id \"2fbf92f363bb6e201d25fab72dd5936e3649147c837a7dcfe773a5ebac655f0b\"" Jan 29 11:05:35.293974 containerd[1478]: time="2025-01-29T11:05:35.293911527Z" level=info msg="CreateContainer within sandbox \"2fbf92f363bb6e201d25fab72dd5936e3649147c837a7dcfe773a5ebac655f0b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 29 11:05:35.307730 kubelet[2453]: W0129 11:05:35.306476 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://78.46.186.225:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152-2-0-b-6e231d00a9&limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:35.307730 kubelet[2453]: E0129 11:05:35.306559 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://78.46.186.225:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4152-2-0-b-6e231d00a9&limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:35.313267 kubelet[2453]: W0129 11:05:35.313203 2453 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://78.46.186.225:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:35.313618 kubelet[2453]: E0129 11:05:35.313507 2453 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://78.46.186.225:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 78.46.186.225:6443: connect: connection refused Jan 29 11:05:35.318667 containerd[1478]: time="2025-01-29T11:05:35.318625935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4152-2-0-b-6e231d00a9,Uid:b1d1b206fa4fded7a9d943911580657c,Namespace:kube-system,Attempt:0,} returns sandbox id \"a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572\"" Jan 29 11:05:35.321828 containerd[1478]: time="2025-01-29T11:05:35.321760082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4152-2-0-b-6e231d00a9,Uid:2f4dc54942d4fda2da45d30d85407f19,Namespace:kube-system,Attempt:0,} returns sandbox id \"6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba\"" Jan 29 11:05:35.325197 containerd[1478]: time="2025-01-29T11:05:35.324835468Z" level=info msg="CreateContainer within sandbox \"a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 29 11:05:35.327612 containerd[1478]: time="2025-01-29T11:05:35.327572731Z" level=info msg="CreateContainer within sandbox \"6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 29 11:05:35.332338 containerd[1478]: time="2025-01-29T11:05:35.332290170Z" level=info msg="CreateContainer within sandbox \"2fbf92f363bb6e201d25fab72dd5936e3649147c837a7dcfe773a5ebac655f0b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"235c9e3e3882d3c3a2ba35a7a87e4aa3c7493cf1d360e7172201b4a3c2090b6d\"" Jan 29 11:05:35.334744 containerd[1478]: 
time="2025-01-29T11:05:35.333442700Z" level=info msg="StartContainer for \"235c9e3e3882d3c3a2ba35a7a87e4aa3c7493cf1d360e7172201b4a3c2090b6d\"" Jan 29 11:05:35.356299 containerd[1478]: time="2025-01-29T11:05:35.356251812Z" level=info msg="CreateContainer within sandbox \"a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b\"" Jan 29 11:05:35.357869 containerd[1478]: time="2025-01-29T11:05:35.357755505Z" level=info msg="CreateContainer within sandbox \"6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4\"" Jan 29 11:05:35.358249 containerd[1478]: time="2025-01-29T11:05:35.358016187Z" level=info msg="StartContainer for \"474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b\"" Jan 29 11:05:35.359464 containerd[1478]: time="2025-01-29T11:05:35.359441039Z" level=info msg="StartContainer for \"1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4\"" Jan 29 11:05:35.371036 systemd[1]: Started cri-containerd-235c9e3e3882d3c3a2ba35a7a87e4aa3c7493cf1d360e7172201b4a3c2090b6d.scope - libcontainer container 235c9e3e3882d3c3a2ba35a7a87e4aa3c7493cf1d360e7172201b4a3c2090b6d. Jan 29 11:05:35.398009 systemd[1]: Started cri-containerd-1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4.scope - libcontainer container 1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4. Jan 29 11:05:35.414293 systemd[1]: Started cri-containerd-474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b.scope - libcontainer container 474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b. 
Jan 29 11:05:35.455751 containerd[1478]: time="2025-01-29T11:05:35.455672010Z" level=info msg="StartContainer for \"235c9e3e3882d3c3a2ba35a7a87e4aa3c7493cf1d360e7172201b4a3c2090b6d\" returns successfully" Jan 29 11:05:35.486970 kubelet[2453]: E0129 11:05:35.486781 2453 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.186.225:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4152-2-0-b-6e231d00a9?timeout=10s\": dial tcp 78.46.186.225:6443: connect: connection refused" interval="1.6s" Jan 29 11:05:35.490738 containerd[1478]: time="2025-01-29T11:05:35.490511223Z" level=info msg="StartContainer for \"1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4\" returns successfully" Jan 29 11:05:35.496507 containerd[1478]: time="2025-01-29T11:05:35.496111991Z" level=info msg="StartContainer for \"474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b\" returns successfully" Jan 29 11:05:35.598057 kubelet[2453]: I0129 11:05:35.597870 2453 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:37.600063 kubelet[2453]: E0129 11:05:37.600021 2453 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4152-2-0-b-6e231d00a9\" not found" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:37.695645 kubelet[2453]: E0129 11:05:37.695369 2453 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4152-2-0-b-6e231d00a9.181f251084c1a0a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4152-2-0-b-6e231d00a9,UID:ci-4152-2-0-b-6e231d00a9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4152-2-0-b-6e231d00a9,},FirstTimestamp:2025-01-29 11:05:34.065287331 +0000 UTC m=+1.089686570,LastTimestamp:2025-01-29 11:05:34.065287331 +0000 UTC m=+1.089686570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152-2-0-b-6e231d00a9,}" Jan 29 11:05:37.734909 kubelet[2453]: I0129 11:05:37.734860 2453 kubelet_node_status.go:76] "Successfully registered node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:38.063973 kubelet[2453]: I0129 11:05:38.063945 2453 apiserver.go:52] "Watching apiserver" Jan 29 11:05:38.084759 kubelet[2453]: I0129 11:05:38.084690 2453 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Jan 29 11:05:39.922573 systemd[1]: Reloading requested from client PID 2730 ('systemctl') (unit session-7.scope)... Jan 29 11:05:39.922596 systemd[1]: Reloading... Jan 29 11:05:40.090863 zram_generator::config[2773]: No configuration found. Jan 29 11:05:40.238240 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 11:05:40.320710 systemd[1]: Reloading finished in 397 ms. Jan 29 11:05:40.368987 kubelet[2453]: I0129 11:05:40.368854 2453 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 29 11:05:40.369472 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:40.383715 systemd[1]: kubelet.service: Deactivated successfully. 
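The first kubelet instance bootstrapped its client credentials in the background ("Client rotation is on, will bootstrap in background"), and the instance restarted just below loads the rotated pair from /var/lib/kubelet/pki/kubelet-client-current.pem. A stand-alone Go sketch (the path is taken from the log; running it requires root on the node) that prints the subject and expiry of the certificates in that file:

package main

import (
    "crypto/x509"
    "encoding/pem"
    "fmt"
    "log"
    "os"
)

func main() {
    data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
    if err != nil {
        log.Fatal(err)
    }
    // The file carries the client certificate(s) and key; report each certificate block.
    for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
        if block.Type != "CERTIFICATE" {
            continue
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Printf("subject=%v notAfter=%v\n", cert.Subject, cert.NotAfter)
    }
}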
Jan 29 11:05:40.384206 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:40.384290 systemd[1]: kubelet.service: Consumed 1.528s CPU time, 111.5M memory peak, 0B memory swap peak. Jan 29 11:05:40.394461 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 11:05:40.538125 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 11:05:40.546320 (kubelet)[2815]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 29 11:05:40.607263 kubelet[2815]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 11:05:40.608268 kubelet[2815]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 29 11:05:40.608268 kubelet[2815]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 11:05:40.608268 kubelet[2815]: I0129 11:05:40.607401 2815 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 29 11:05:40.612874 kubelet[2815]: I0129 11:05:40.612841 2815 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Jan 29 11:05:40.613030 kubelet[2815]: I0129 11:05:40.613019 2815 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 29 11:05:40.613360 kubelet[2815]: I0129 11:05:40.613276 2815 server.go:927] "Client rotation is on, will bootstrap in background" Jan 29 11:05:40.615855 kubelet[2815]: I0129 11:05:40.615572 2815 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 29 11:05:40.619865 kubelet[2815]: I0129 11:05:40.617805 2815 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 29 11:05:40.631649 kubelet[2815]: I0129 11:05:40.631574 2815 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 29 11:05:40.631997 kubelet[2815]: I0129 11:05:40.631930 2815 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 29 11:05:40.632384 kubelet[2815]: I0129 11:05:40.631985 2815 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4152-2-0-b-6e231d00a9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 29 11:05:40.632384 kubelet[2815]: I0129 11:05:40.632249 2815 topology_manager.go:138] "Creating topology manager with none policy" Jan 29 11:05:40.632384 kubelet[2815]: I0129 11:05:40.632259 2815 container_manager_linux.go:301] "Creating device plugin manager" Jan 29 11:05:40.632384 kubelet[2815]: I0129 11:05:40.632304 2815 state_mem.go:36] "Initialized new in-memory state store" Jan 29 11:05:40.632658 kubelet[2815]: I0129 11:05:40.632445 2815 kubelet.go:400] "Attempting to sync node with API server" Jan 29 11:05:40.632658 kubelet[2815]: I0129 11:05:40.632462 2815 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 29 11:05:40.633256 kubelet[2815]: I0129 11:05:40.633058 2815 kubelet.go:312] "Adding apiserver pod source" Jan 29 11:05:40.633256 kubelet[2815]: I0129 11:05:40.633100 2815 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 29 11:05:40.641803 kubelet[2815]: I0129 11:05:40.637854 2815 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 29 11:05:40.641803 kubelet[2815]: I0129 11:05:40.638079 2815 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 29 11:05:40.641803 kubelet[2815]: I0129 11:05:40.638559 2815 server.go:1264] "Started kubelet" Jan 29 11:05:40.641803 kubelet[2815]: I0129 11:05:40.641055 2815 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 29 11:05:40.644831 kubelet[2815]: I0129 11:05:40.644169 2815 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 29 11:05:40.648650 kubelet[2815]: I0129 11:05:40.648611 2815 server.go:455] "Adding 
debug handlers to kubelet server" Jan 29 11:05:40.650031 kubelet[2815]: I0129 11:05:40.649990 2815 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 29 11:05:40.650224 kubelet[2815]: I0129 11:05:40.650005 2815 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 29 11:05:40.650697 kubelet[2815]: I0129 11:05:40.650680 2815 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 29 11:05:40.652271 kubelet[2815]: I0129 11:05:40.652233 2815 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 29 11:05:40.652417 kubelet[2815]: I0129 11:05:40.652397 2815 reconciler.go:26] "Reconciler: start to sync state" Jan 29 11:05:40.658709 kubelet[2815]: I0129 11:05:40.655503 2815 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 29 11:05:40.658709 kubelet[2815]: I0129 11:05:40.657699 2815 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 29 11:05:40.658709 kubelet[2815]: I0129 11:05:40.657741 2815 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 29 11:05:40.658709 kubelet[2815]: I0129 11:05:40.657764 2815 kubelet.go:2337] "Starting kubelet main sync loop" Jan 29 11:05:40.658709 kubelet[2815]: E0129 11:05:40.658040 2815 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 29 11:05:40.681842 kubelet[2815]: I0129 11:05:40.679550 2815 factory.go:221] Registration of the containerd container factory successfully Jan 29 11:05:40.681842 kubelet[2815]: I0129 11:05:40.679586 2815 factory.go:221] Registration of the systemd container factory successfully Jan 29 11:05:40.681842 kubelet[2815]: I0129 11:05:40.679697 2815 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 29 11:05:40.754408 kubelet[2815]: I0129 11:05:40.754343 2815 kubelet_node_status.go:73] "Attempting to register node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:40.755036 kubelet[2815]: I0129 11:05:40.754828 2815 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 29 11:05:40.755036 kubelet[2815]: I0129 11:05:40.754856 2815 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 29 11:05:40.755036 kubelet[2815]: I0129 11:05:40.755168 2815 state_mem.go:36] "Initialized new in-memory state store" Jan 29 11:05:40.755036 kubelet[2815]: I0129 11:05:40.755502 2815 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 29 11:05:40.755036 kubelet[2815]: I0129 11:05:40.755523 2815 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 29 11:05:40.755036 kubelet[2815]: I0129 11:05:40.755551 2815 policy_none.go:49] "None policy: Start" Jan 29 11:05:40.758707 kubelet[2815]: I0129 11:05:40.757294 2815 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 29 11:05:40.758707 kubelet[2815]: I0129 11:05:40.757336 2815 state_mem.go:35] "Initializing new in-memory state store" Jan 29 11:05:40.758707 kubelet[2815]: I0129 11:05:40.757583 2815 state_mem.go:75] "Updated machine memory state" Jan 29 11:05:40.758707 kubelet[2815]: E0129 11:05:40.758614 2815 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 29 11:05:40.764757 kubelet[2815]: I0129 11:05:40.764298 2815 manager.go:479] 
"Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 29 11:05:40.764757 kubelet[2815]: I0129 11:05:40.764493 2815 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 29 11:05:40.764757 kubelet[2815]: I0129 11:05:40.764596 2815 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 29 11:05:40.779053 kubelet[2815]: I0129 11:05:40.778645 2815 kubelet_node_status.go:112] "Node was previously registered" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:40.779053 kubelet[2815]: I0129 11:05:40.778764 2815 kubelet_node_status.go:76] "Successfully registered node" node="ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:40.959834 kubelet[2815]: I0129 11:05:40.959641 2815 topology_manager.go:215] "Topology Admit Handler" podUID="31f5e2f665aa44c47678d1c5dcb56330" podNamespace="kube-system" podName="kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:40.959834 kubelet[2815]: I0129 11:05:40.959810 2815 topology_manager.go:215] "Topology Admit Handler" podUID="b1d1b206fa4fded7a9d943911580657c" podNamespace="kube-system" podName="kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:40.960046 kubelet[2815]: I0129 11:05:40.959886 2815 topology_manager.go:215] "Topology Admit Handler" podUID="2f4dc54942d4fda2da45d30d85407f19" podNamespace="kube-system" podName="kube-scheduler-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055372 kubelet[2815]: I0129 11:05:41.054668 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/31f5e2f665aa44c47678d1c5dcb56330-k8s-certs\") pod \"kube-apiserver-ci-4152-2-0-b-6e231d00a9\" (UID: \"31f5e2f665aa44c47678d1c5dcb56330\") " pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055372 kubelet[2815]: I0129 11:05:41.054756 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-ca-certs\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055372 kubelet[2815]: I0129 11:05:41.054845 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-kubeconfig\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055372 kubelet[2815]: I0129 11:05:41.054888 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2f4dc54942d4fda2da45d30d85407f19-kubeconfig\") pod \"kube-scheduler-ci-4152-2-0-b-6e231d00a9\" (UID: \"2f4dc54942d4fda2da45d30d85407f19\") " pod="kube-system/kube-scheduler-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055372 kubelet[2815]: I0129 11:05:41.054937 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/31f5e2f665aa44c47678d1c5dcb56330-ca-certs\") pod \"kube-apiserver-ci-4152-2-0-b-6e231d00a9\" (UID: \"31f5e2f665aa44c47678d1c5dcb56330\") " pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055621 kubelet[2815]: 
I0129 11:05:41.054996 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/31f5e2f665aa44c47678d1c5dcb56330-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4152-2-0-b-6e231d00a9\" (UID: \"31f5e2f665aa44c47678d1c5dcb56330\") " pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055621 kubelet[2815]: I0129 11:05:41.055035 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-flexvolume-dir\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055621 kubelet[2815]: I0129 11:05:41.055072 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-k8s-certs\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.055621 kubelet[2815]: I0129 11:05:41.055107 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b1d1b206fa4fded7a9d943911580657c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4152-2-0-b-6e231d00a9\" (UID: \"b1d1b206fa4fded7a9d943911580657c\") " pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" Jan 29 11:05:41.638365 kubelet[2815]: I0129 11:05:41.638306 2815 apiserver.go:52] "Watching apiserver" Jan 29 11:05:41.653848 kubelet[2815]: I0129 11:05:41.653399 2815 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Jan 29 11:05:41.846004 kubelet[2815]: I0129 11:05:41.845918 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4152-2-0-b-6e231d00a9" podStartSLOduration=1.845870414 podStartE2EDuration="1.845870414s" podCreationTimestamp="2025-01-29 11:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 11:05:41.811050426 +0000 UTC m=+1.259406532" watchObservedRunningTime="2025-01-29 11:05:41.845870414 +0000 UTC m=+1.294226520" Jan 29 11:05:41.874246 kubelet[2815]: I0129 11:05:41.874107 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4152-2-0-b-6e231d00a9" podStartSLOduration=1.8740848799999998 podStartE2EDuration="1.87408488s" podCreationTimestamp="2025-01-29 11:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 11:05:41.846558459 +0000 UTC m=+1.294914565" watchObservedRunningTime="2025-01-29 11:05:41.87408488 +0000 UTC m=+1.322440986" Jan 29 11:05:41.900330 kubelet[2815]: I0129 11:05:41.899707 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4152-2-0-b-6e231d00a9" podStartSLOduration=1.8996811679999999 podStartE2EDuration="1.899681168s" podCreationTimestamp="2025-01-29 11:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 11:05:41.874766165 +0000 UTC m=+1.323122271" watchObservedRunningTime="2025-01-29 11:05:41.899681168 +0000 UTC m=+1.348037314" Jan 29 11:05:44.984231 systemd[1]: Started sshd@7-78.46.186.225:22-195.178.110.65:59442.service - OpenSSH per-connection server daemon (195.178.110.65:59442). Jan 29 11:05:45.022947 sshd[2878]: Connection closed by 195.178.110.65 port 59442 Jan 29 11:05:45.024114 systemd[1]: sshd@7-78.46.186.225:22-195.178.110.65:59442.service: Deactivated successfully. Jan 29 11:05:45.966845 sudo[1868]: pam_unix(sudo:session): session closed for user root Jan 29 11:05:46.126030 sshd[1864]: Connection closed by 147.75.109.163 port 50100 Jan 29 11:05:46.127280 sshd-session[1862]: pam_unix(sshd:session): session closed for user core Jan 29 11:05:46.137005 systemd[1]: sshd@6-78.46.186.225:22-147.75.109.163:50100.service: Deactivated successfully. Jan 29 11:05:46.142112 systemd[1]: session-7.scope: Deactivated successfully. Jan 29 11:05:46.142508 systemd[1]: session-7.scope: Consumed 7.187s CPU time, 189.6M memory peak, 0B memory swap peak. Jan 29 11:05:46.148025 systemd-logind[1458]: Session 7 logged out. Waiting for processes to exit. Jan 29 11:05:46.150951 systemd-logind[1458]: Removed session 7. Jan 29 11:05:53.678719 kubelet[2815]: I0129 11:05:53.678643 2815 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 29 11:05:53.680092 containerd[1478]: time="2025-01-29T11:05:53.679608577Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 29 11:05:53.680541 kubelet[2815]: I0129 11:05:53.679830 2815 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 29 11:05:54.581052 kubelet[2815]: I0129 11:05:54.581002 2815 topology_manager.go:215] "Topology Admit Handler" podUID="641c3348-e6dc-4855-869d-3befd4a68658" podNamespace="kube-system" podName="kube-proxy-c8cf6" Jan 29 11:05:54.592476 systemd[1]: Created slice kubepods-besteffort-pod641c3348_e6dc_4855_869d_3befd4a68658.slice - libcontainer container kubepods-besteffort-pod641c3348_e6dc_4855_869d_3befd4a68658.slice. 
Jan 29 11:05:54.741797 kubelet[2815]: I0129 11:05:54.741708 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/641c3348-e6dc-4855-869d-3befd4a68658-kube-proxy\") pod \"kube-proxy-c8cf6\" (UID: \"641c3348-e6dc-4855-869d-3befd4a68658\") " pod="kube-system/kube-proxy-c8cf6" Jan 29 11:05:54.741797 kubelet[2815]: I0129 11:05:54.741790 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/641c3348-e6dc-4855-869d-3befd4a68658-lib-modules\") pod \"kube-proxy-c8cf6\" (UID: \"641c3348-e6dc-4855-869d-3befd4a68658\") " pod="kube-system/kube-proxy-c8cf6" Jan 29 11:05:54.742458 kubelet[2815]: I0129 11:05:54.741854 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/641c3348-e6dc-4855-869d-3befd4a68658-xtables-lock\") pod \"kube-proxy-c8cf6\" (UID: \"641c3348-e6dc-4855-869d-3befd4a68658\") " pod="kube-system/kube-proxy-c8cf6" Jan 29 11:05:54.742458 kubelet[2815]: I0129 11:05:54.741884 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxlt\" (UniqueName: \"kubernetes.io/projected/641c3348-e6dc-4855-869d-3befd4a68658-kube-api-access-stxlt\") pod \"kube-proxy-c8cf6\" (UID: \"641c3348-e6dc-4855-869d-3befd4a68658\") " pod="kube-system/kube-proxy-c8cf6" Jan 29 11:05:54.821420 kubelet[2815]: I0129 11:05:54.821368 2815 topology_manager.go:215] "Topology Admit Handler" podUID="56b4a1f1-114f-4794-b7f2-e9383c60e33b" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-kdjdj" Jan 29 11:05:54.830876 systemd[1]: Created slice kubepods-besteffort-pod56b4a1f1_114f_4794_b7f2_e9383c60e33b.slice - libcontainer container kubepods-besteffort-pod56b4a1f1_114f_4794_b7f2_e9383c60e33b.slice. Jan 29 11:05:54.909317 containerd[1478]: time="2025-01-29T11:05:54.909239140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c8cf6,Uid:641c3348-e6dc-4855-869d-3befd4a68658,Namespace:kube-system,Attempt:0,}" Jan 29 11:05:54.944012 kubelet[2815]: I0129 11:05:54.943555 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrvsc\" (UniqueName: \"kubernetes.io/projected/56b4a1f1-114f-4794-b7f2-e9383c60e33b-kube-api-access-mrvsc\") pod \"tigera-operator-7bc55997bb-kdjdj\" (UID: \"56b4a1f1-114f-4794-b7f2-e9383c60e33b\") " pod="tigera-operator/tigera-operator-7bc55997bb-kdjdj" Jan 29 11:05:54.944012 kubelet[2815]: I0129 11:05:54.943600 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/56b4a1f1-114f-4794-b7f2-e9383c60e33b-var-lib-calico\") pod \"tigera-operator-7bc55997bb-kdjdj\" (UID: \"56b4a1f1-114f-4794-b7f2-e9383c60e33b\") " pod="tigera-operator/tigera-operator-7bc55997bb-kdjdj" Jan 29 11:05:54.950389 containerd[1478]: time="2025-01-29T11:05:54.948452521Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:05:54.950389 containerd[1478]: time="2025-01-29T11:05:54.948516761Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:05:54.950389 containerd[1478]: time="2025-01-29T11:05:54.948532961Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:54.950389 containerd[1478]: time="2025-01-29T11:05:54.948621162Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:54.978094 systemd[1]: Started cri-containerd-3a1d0d34d02818bc4006c26b8358b555021ece267694255ec944366123dd0d3c.scope - libcontainer container 3a1d0d34d02818bc4006c26b8358b555021ece267694255ec944366123dd0d3c. Jan 29 11:05:55.009874 containerd[1478]: time="2025-01-29T11:05:55.009783060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c8cf6,Uid:641c3348-e6dc-4855-869d-3befd4a68658,Namespace:kube-system,Attempt:0,} returns sandbox id \"3a1d0d34d02818bc4006c26b8358b555021ece267694255ec944366123dd0d3c\"" Jan 29 11:05:55.013632 containerd[1478]: time="2025-01-29T11:05:55.013525753Z" level=info msg="CreateContainer within sandbox \"3a1d0d34d02818bc4006c26b8358b555021ece267694255ec944366123dd0d3c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 29 11:05:55.027485 containerd[1478]: time="2025-01-29T11:05:55.027405400Z" level=info msg="CreateContainer within sandbox \"3a1d0d34d02818bc4006c26b8358b555021ece267694255ec944366123dd0d3c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"414d7e6a1e3d86c61fe6407c3a29be8d2f08a627d9ad346d78bdd879a38c518b\"" Jan 29 11:05:55.030354 containerd[1478]: time="2025-01-29T11:05:55.028478804Z" level=info msg="StartContainer for \"414d7e6a1e3d86c61fe6407c3a29be8d2f08a627d9ad346d78bdd879a38c518b\"" Jan 29 11:05:55.072071 systemd[1]: Started cri-containerd-414d7e6a1e3d86c61fe6407c3a29be8d2f08a627d9ad346d78bdd879a38c518b.scope - libcontainer container 414d7e6a1e3d86c61fe6407c3a29be8d2f08a627d9ad346d78bdd879a38c518b. Jan 29 11:05:55.104478 containerd[1478]: time="2025-01-29T11:05:55.104271903Z" level=info msg="StartContainer for \"414d7e6a1e3d86c61fe6407c3a29be8d2f08a627d9ad346d78bdd879a38c518b\" returns successfully" Jan 29 11:05:55.137620 containerd[1478]: time="2025-01-29T11:05:55.137528216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-kdjdj,Uid:56b4a1f1-114f-4794-b7f2-e9383c60e33b,Namespace:tigera-operator,Attempt:0,}" Jan 29 11:05:55.169832 containerd[1478]: time="2025-01-29T11:05:55.168932683Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:05:55.169832 containerd[1478]: time="2025-01-29T11:05:55.168990484Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:05:55.169832 containerd[1478]: time="2025-01-29T11:05:55.169001764Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:55.169832 containerd[1478]: time="2025-01-29T11:05:55.169082844Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:05:55.191014 systemd[1]: Started cri-containerd-bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d.scope - libcontainer container bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d. 
Jan 29 11:05:55.233788 containerd[1478]: time="2025-01-29T11:05:55.233633224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-kdjdj,Uid:56b4a1f1-114f-4794-b7f2-e9383c60e33b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d\"" Jan 29 11:05:55.238081 containerd[1478]: time="2025-01-29T11:05:55.238042559Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 29 11:05:56.876516 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1732539379.mount: Deactivated successfully. Jan 29 11:05:58.043053 containerd[1478]: time="2025-01-29T11:05:58.042972831Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:58.044486 containerd[1478]: time="2025-01-29T11:05:58.044431515Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=19124160" Jan 29 11:05:58.046050 containerd[1478]: time="2025-01-29T11:05:58.045982120Z" level=info msg="ImageCreate event name:\"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:58.053310 containerd[1478]: time="2025-01-29T11:05:58.053074300Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:05:58.055575 containerd[1478]: time="2025-01-29T11:05:58.055469587Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"19120155\" in 2.816465184s" Jan 29 11:05:58.055575 containerd[1478]: time="2025-01-29T11:05:58.055504908Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\"" Jan 29 11:05:58.059132 containerd[1478]: time="2025-01-29T11:05:58.059000638Z" level=info msg="CreateContainer within sandbox \"bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 29 11:05:58.083021 containerd[1478]: time="2025-01-29T11:05:58.082964227Z" level=info msg="CreateContainer within sandbox \"bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711\"" Jan 29 11:05:58.085663 containerd[1478]: time="2025-01-29T11:05:58.084649872Z" level=info msg="StartContainer for \"d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711\"" Jan 29 11:05:58.116179 systemd[1]: run-containerd-runc-k8s.io-d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711-runc.NqnTSJ.mount: Deactivated successfully. Jan 29 11:05:58.127402 systemd[1]: Started cri-containerd-d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711.scope - libcontainer container d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711. 
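The pull of quay.io/tigera/operator:v1.36.2 above is bracketed by the "PullImage" entry at 11:05:55.238 and the "Pulled image ... in 2.816465184s" entry at 11:05:58.055. A quick cross-check of that figure from the two log timestamps (a sketch only; containerd times the pull internally, so the numbers differ slightly):

```python
from datetime import datetime, timezone

def parse(ts: str) -> datetime:
    # containerd prints nanoseconds; strptime's %f takes at most 6 digits,
    # so truncate the fractional part to microseconds first.
    head, frac = ts.rstrip("Z").split(".")
    return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%dT%H:%M:%S.%f").replace(tzinfo=timezone.utc)

started  = parse("2025-01-29T11:05:55.238042559Z")   # "PullImage ..." entry
finished = parse("2025-01-29T11:05:58.055469587Z")   # "Pulled image ..." entry

print(finished - started)   # 0:00:02.817427 -- consistent with the reported 2.816465184s
```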
Jan 29 11:05:58.160101 containerd[1478]: time="2025-01-29T11:05:58.160055891Z" level=info msg="StartContainer for \"d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711\" returns successfully" Jan 29 11:05:58.802890 kubelet[2815]: I0129 11:05:58.802666 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-c8cf6" podStartSLOduration=4.802636515 podStartE2EDuration="4.802636515s" podCreationTimestamp="2025-01-29 11:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 11:05:55.785413188 +0000 UTC m=+15.233769334" watchObservedRunningTime="2025-01-29 11:05:58.802636515 +0000 UTC m=+18.250992701" Jan 29 11:05:58.803712 kubelet[2815]: I0129 11:05:58.802989 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-kdjdj" podStartSLOduration=1.982828359 podStartE2EDuration="4.802972836s" podCreationTimestamp="2025-01-29 11:05:54 +0000 UTC" firstStartedPulling="2025-01-29 11:05:55.236226713 +0000 UTC m=+14.684582819" lastFinishedPulling="2025-01-29 11:05:58.05637119 +0000 UTC m=+17.504727296" observedRunningTime="2025-01-29 11:05:58.80094747 +0000 UTC m=+18.249303656" watchObservedRunningTime="2025-01-29 11:05:58.802972836 +0000 UTC m=+18.251329022" Jan 29 11:06:01.732490 kubelet[2815]: I0129 11:06:01.732406 2815 topology_manager.go:215] "Topology Admit Handler" podUID="22b30d62-7cc1-4383-8ded-e5aac57cff28" podNamespace="calico-system" podName="calico-typha-b94d647c7-j7qvc" Jan 29 11:06:01.743542 systemd[1]: Created slice kubepods-besteffort-pod22b30d62_7cc1_4383_8ded_e5aac57cff28.slice - libcontainer container kubepods-besteffort-pod22b30d62_7cc1_4383_8ded_e5aac57cff28.slice. Jan 29 11:06:01.864378 kubelet[2815]: I0129 11:06:01.861890 2815 topology_manager.go:215] "Topology Admit Handler" podUID="017b8051-dcdc-4c0d-a2ab-b0301e12bd1a" podNamespace="calico-system" podName="calico-node-ztccw" Jan 29 11:06:01.874920 systemd[1]: Created slice kubepods-besteffort-pod017b8051_dcdc_4c0d_a2ab_b0301e12bd1a.slice - libcontainer container kubepods-besteffort-pod017b8051_dcdc_4c0d_a2ab_b0301e12bd1a.slice. 
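The pod_startup_latency_tracker entries above report both a podStartE2EDuration and a shorter podStartSLOduration for tigera-operator-7bc55997bb-kdjdj, and the figures are mutually consistent: the E2E value is the gap between podCreationTimestamp and the observed running time, and the SLO value is that gap minus the image-pull window (firstStartedPulling to lastFinishedPulling). A rough check with the timestamps from the entry, truncated to microseconds (a sketch, not kubelet code):

```python
from datetime import datetime

def t(s: str) -> datetime:
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")

created   = datetime.strptime("2025-01-29 11:05:54", "%Y-%m-%d %H:%M:%S")  # podCreationTimestamp
observed  = t("2025-01-29 11:05:58.802972")   # watchObservedRunningTime
pull_from = t("2025-01-29 11:05:55.236226")   # firstStartedPulling
pull_to   = t("2025-01-29 11:05:58.056371")   # lastFinishedPulling

e2e = (observed - created).total_seconds()           # ~4.802972  (reported podStartE2EDuration: 4.802972836s)
slo = e2e - (pull_to - pull_from).total_seconds()    # ~1.982827  (reported podStartSLOduration: 1.982828359s)
print(round(e2e, 6), round(slo, 6))
```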
Jan 29 11:06:01.885223 kubelet[2815]: I0129 11:06:01.885161 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22b30d62-7cc1-4383-8ded-e5aac57cff28-tigera-ca-bundle\") pod \"calico-typha-b94d647c7-j7qvc\" (UID: \"22b30d62-7cc1-4383-8ded-e5aac57cff28\") " pod="calico-system/calico-typha-b94d647c7-j7qvc" Jan 29 11:06:01.885664 kubelet[2815]: I0129 11:06:01.885636 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/22b30d62-7cc1-4383-8ded-e5aac57cff28-typha-certs\") pod \"calico-typha-b94d647c7-j7qvc\" (UID: \"22b30d62-7cc1-4383-8ded-e5aac57cff28\") " pod="calico-system/calico-typha-b94d647c7-j7qvc" Jan 29 11:06:01.885706 kubelet[2815]: I0129 11:06:01.885681 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcqtx\" (UniqueName: \"kubernetes.io/projected/22b30d62-7cc1-4383-8ded-e5aac57cff28-kube-api-access-bcqtx\") pod \"calico-typha-b94d647c7-j7qvc\" (UID: \"22b30d62-7cc1-4383-8ded-e5aac57cff28\") " pod="calico-system/calico-typha-b94d647c7-j7qvc" Jan 29 11:06:01.987520 kubelet[2815]: I0129 11:06:01.986404 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-cni-bin-dir\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987520 kubelet[2815]: I0129 11:06:01.986456 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-cni-net-dir\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987520 kubelet[2815]: I0129 11:06:01.986473 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-flexvol-driver-host\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987520 kubelet[2815]: I0129 11:06:01.986502 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-lib-modules\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987520 kubelet[2815]: I0129 11:06:01.986519 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-policysync\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987739 kubelet[2815]: I0129 11:06:01.986534 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-cni-log-dir\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 
11:06:01.987739 kubelet[2815]: I0129 11:06:01.986550 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-var-run-calico\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987739 kubelet[2815]: I0129 11:06:01.986613 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vk2\" (UniqueName: \"kubernetes.io/projected/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-kube-api-access-f8vk2\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987739 kubelet[2815]: I0129 11:06:01.986634 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-tigera-ca-bundle\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987739 kubelet[2815]: I0129 11:06:01.986649 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-node-certs\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987893 kubelet[2815]: I0129 11:06:01.986676 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-xtables-lock\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:01.987893 kubelet[2815]: I0129 11:06:01.986690 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/017b8051-dcdc-4c0d-a2ab-b0301e12bd1a-var-lib-calico\") pod \"calico-node-ztccw\" (UID: \"017b8051-dcdc-4c0d-a2ab-b0301e12bd1a\") " pod="calico-system/calico-node-ztccw" Jan 29 11:06:02.048273 containerd[1478]: time="2025-01-29T11:06:02.048230839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b94d647c7-j7qvc,Uid:22b30d62-7cc1-4383-8ded-e5aac57cff28,Namespace:calico-system,Attempt:0,}" Jan 29 11:06:02.051352 kubelet[2815]: I0129 11:06:02.051292 2815 topology_manager.go:215] "Topology Admit Handler" podUID="66d59454-c196-4ace-a57f-96550c417a39" podNamespace="calico-system" podName="csi-node-driver-mtvgj" Jan 29 11:06:02.051875 kubelet[2815]: E0129 11:06:02.051843 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:02.085264 containerd[1478]: time="2025-01-29T11:06:02.084576722Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:02.085264 containerd[1478]: time="2025-01-29T11:06:02.085143084Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:02.086397 containerd[1478]: time="2025-01-29T11:06:02.086341246Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:02.088162 containerd[1478]: time="2025-01-29T11:06:02.086808567Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:02.090288 kubelet[2815]: E0129 11:06:02.090264 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.090470 kubelet[2815]: W0129 11:06:02.090401 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.090470 kubelet[2815]: E0129 11:06:02.090435 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.091012 kubelet[2815]: E0129 11:06:02.090900 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.091012 kubelet[2815]: W0129 11:06:02.090916 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.091012 kubelet[2815]: E0129 11:06:02.090929 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.093008 kubelet[2815]: E0129 11:06:02.092900 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.093008 kubelet[2815]: W0129 11:06:02.092930 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.093008 kubelet[2815]: E0129 11:06:02.092957 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.103909 kubelet[2815]: E0129 11:06:02.102521 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.103909 kubelet[2815]: W0129 11:06:02.102547 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.103909 kubelet[2815]: E0129 11:06:02.102634 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.103909 kubelet[2815]: E0129 11:06:02.103533 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.103909 kubelet[2815]: W0129 11:06:02.103555 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.103909 kubelet[2815]: E0129 11:06:02.103592 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.106140 kubelet[2815]: E0129 11:06:02.104805 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.106140 kubelet[2815]: W0129 11:06:02.104845 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.106140 kubelet[2815]: E0129 11:06:02.104880 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.106140 kubelet[2815]: E0129 11:06:02.105533 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.106140 kubelet[2815]: W0129 11:06:02.105549 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.106140 kubelet[2815]: E0129 11:06:02.105597 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.107562 kubelet[2815]: E0129 11:06:02.106943 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.107562 kubelet[2815]: W0129 11:06:02.106973 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.107562 kubelet[2815]: E0129 11:06:02.107006 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.107562 kubelet[2815]: E0129 11:06:02.107187 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.107562 kubelet[2815]: W0129 11:06:02.107198 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.107562 kubelet[2815]: E0129 11:06:02.107219 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.110219 kubelet[2815]: E0129 11:06:02.108358 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.110219 kubelet[2815]: W0129 11:06:02.108387 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.110219 kubelet[2815]: E0129 11:06:02.108428 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.110219 kubelet[2815]: E0129 11:06:02.110075 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.110219 kubelet[2815]: W0129 11:06:02.110102 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.110219 kubelet[2815]: E0129 11:06:02.110135 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.110551 kubelet[2815]: E0129 11:06:02.110535 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.110765 kubelet[2815]: W0129 11:06:02.110609 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.110765 kubelet[2815]: E0129 11:06:02.110647 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.112062 kubelet[2815]: E0129 11:06:02.111021 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.112062 kubelet[2815]: W0129 11:06:02.111134 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.112062 kubelet[2815]: E0129 11:06:02.111161 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.112062 kubelet[2815]: E0129 11:06:02.111710 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.112062 kubelet[2815]: W0129 11:06:02.111726 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.112062 kubelet[2815]: E0129 11:06:02.111844 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.113735 kubelet[2815]: E0129 11:06:02.112576 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.113735 kubelet[2815]: W0129 11:06:02.112594 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.113735 kubelet[2815]: E0129 11:06:02.112628 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.113735 kubelet[2815]: E0129 11:06:02.113122 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.113735 kubelet[2815]: W0129 11:06:02.113137 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.113735 kubelet[2815]: E0129 11:06:02.113167 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.113735 kubelet[2815]: E0129 11:06:02.113468 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.113735 kubelet[2815]: W0129 11:06:02.113481 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.113735 kubelet[2815]: E0129 11:06:02.113506 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.115058 kubelet[2815]: E0129 11:06:02.115032 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.115235 kubelet[2815]: W0129 11:06:02.115115 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.115235 kubelet[2815]: E0129 11:06:02.115160 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.115385 kubelet[2815]: E0129 11:06:02.115371 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.116591 kubelet[2815]: W0129 11:06:02.116556 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.116789 kubelet[2815]: E0129 11:06:02.116733 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.117065 kubelet[2815]: E0129 11:06:02.117044 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.118913 kubelet[2815]: W0129 11:06:02.118881 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.119069 kubelet[2815]: E0129 11:06:02.119036 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.119437 kubelet[2815]: E0129 11:06:02.119419 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.119524 kubelet[2815]: W0129 11:06:02.119508 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.119610 kubelet[2815]: E0129 11:06:02.119583 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.121362 kubelet[2815]: E0129 11:06:02.121331 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.121362 kubelet[2815]: W0129 11:06:02.121358 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.121641 kubelet[2815]: E0129 11:06:02.121612 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.122109 kubelet[2815]: E0129 11:06:02.122073 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.122109 kubelet[2815]: W0129 11:06:02.122097 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.122269 kubelet[2815]: E0129 11:06:02.122249 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.123102 kubelet[2815]: E0129 11:06:02.123078 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.123102 kubelet[2815]: W0129 11:06:02.123097 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.123277 kubelet[2815]: E0129 11:06:02.123251 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.123847 kubelet[2815]: E0129 11:06:02.123740 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.123847 kubelet[2815]: W0129 11:06:02.123805 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.124223 kubelet[2815]: E0129 11:06:02.124186 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.125554 kubelet[2815]: E0129 11:06:02.124344 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.125554 kubelet[2815]: W0129 11:06:02.124358 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.125554 kubelet[2815]: E0129 11:06:02.124855 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.125554 kubelet[2815]: E0129 11:06:02.125010 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.125554 kubelet[2815]: W0129 11:06:02.125019 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.125554 kubelet[2815]: E0129 11:06:02.125097 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.125554 kubelet[2815]: E0129 11:06:02.125188 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.125554 kubelet[2815]: W0129 11:06:02.125194 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.125554 kubelet[2815]: E0129 11:06:02.125260 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.130867 kubelet[2815]: E0129 11:06:02.127390 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.130867 kubelet[2815]: W0129 11:06:02.127424 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.130867 kubelet[2815]: E0129 11:06:02.129785 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.130867 kubelet[2815]: W0129 11:06:02.129810 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.130867 kubelet[2815]: E0129 11:06:02.130103 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.130867 kubelet[2815]: W0129 11:06:02.130112 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.130867 kubelet[2815]: E0129 11:06:02.130232 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.130867 kubelet[2815]: W0129 11:06:02.130238 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.130867 kubelet[2815]: E0129 11:06:02.130343 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.130867 kubelet[2815]: W0129 11:06:02.130349 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.130867 kubelet[2815]: E0129 11:06:02.130448 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.131259 kubelet[2815]: W0129 11:06:02.130454 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.131259 kubelet[2815]: E0129 11:06:02.130588 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.131259 kubelet[2815]: W0129 11:06:02.130596 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.131326 kubelet[2815]: E0129 11:06:02.131264 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.131326 kubelet[2815]: W0129 11:06:02.131277 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.131524 kubelet[2815]: E0129 11:06:02.131490 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.131524 kubelet[2815]: W0129 11:06:02.131505 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.131883 kubelet[2815]: E0129 11:06:02.131644 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.131883 kubelet[2815]: W0129 11:06:02.131656 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.132024 kubelet[2815]: E0129 11:06:02.131987 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.132024 kubelet[2815]: W0129 11:06:02.132001 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.132077 kubelet[2815]: E0129 11:06:02.132028 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.140850 kubelet[2815]: E0129 11:06:02.140430 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.143995 kubelet[2815]: E0129 11:06:02.143789 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.143995 kubelet[2815]: E0129 11:06:02.143944 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.143995 kubelet[2815]: W0129 11:06:02.143954 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.143995 kubelet[2815]: E0129 11:06:02.143967 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.144896 kubelet[2815]: E0129 11:06:02.144353 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.144896 kubelet[2815]: E0129 11:06:02.144425 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.144896 kubelet[2815]: E0129 11:06:02.144454 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.144896 kubelet[2815]: E0129 11:06:02.144480 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.146663 kubelet[2815]: E0129 11:06:02.146620 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.146663 kubelet[2815]: W0129 11:06:02.146647 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.146663 kubelet[2815]: E0129 11:06:02.146667 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.150155 kubelet[2815]: E0129 11:06:02.149918 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.150155 kubelet[2815]: W0129 11:06:02.149942 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.150155 kubelet[2815]: E0129 11:06:02.149965 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.150352 kubelet[2815]: E0129 11:06:02.150190 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.150352 kubelet[2815]: E0129 11:06:02.150227 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.150352 kubelet[2815]: E0129 11:06:02.150245 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.150352 kubelet[2815]: E0129 11:06:02.150260 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.150499 kubelet[2815]: E0129 11:06:02.150434 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.150499 kubelet[2815]: W0129 11:06:02.150444 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.150499 kubelet[2815]: E0129 11:06:02.150459 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.153017 kubelet[2815]: E0129 11:06:02.151030 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.153017 kubelet[2815]: W0129 11:06:02.151056 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.153017 kubelet[2815]: E0129 11:06:02.151083 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.153688 kubelet[2815]: E0129 11:06:02.153405 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.153688 kubelet[2815]: W0129 11:06:02.153603 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.153688 kubelet[2815]: E0129 11:06:02.153628 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.156593 kubelet[2815]: E0129 11:06:02.156559 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.156593 kubelet[2815]: W0129 11:06:02.156586 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.156725 kubelet[2815]: E0129 11:06:02.156608 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.158008 kubelet[2815]: E0129 11:06:02.157977 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.158008 kubelet[2815]: W0129 11:06:02.158001 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.158142 kubelet[2815]: E0129 11:06:02.158019 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.160657 kubelet[2815]: E0129 11:06:02.158219 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.160657 kubelet[2815]: W0129 11:06:02.158235 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.160657 kubelet[2815]: E0129 11:06:02.158250 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.160657 kubelet[2815]: E0129 11:06:02.158904 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.160657 kubelet[2815]: W0129 11:06:02.158931 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.160657 kubelet[2815]: E0129 11:06:02.158945 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.160657 kubelet[2815]: E0129 11:06:02.159932 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.160657 kubelet[2815]: W0129 11:06:02.159953 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.160657 kubelet[2815]: E0129 11:06:02.159967 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.164945 kubelet[2815]: E0129 11:06:02.163987 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.164945 kubelet[2815]: W0129 11:06:02.164015 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.164945 kubelet[2815]: E0129 11:06:02.164545 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.164945 kubelet[2815]: W0129 11:06:02.164556 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.165143 kubelet[2815]: E0129 11:06:02.165013 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.165143 kubelet[2815]: W0129 11:06:02.165023 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.166187 kubelet[2815]: E0129 11:06:02.165890 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.166187 kubelet[2815]: W0129 11:06:02.165909 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.166187 kubelet[2815]: E0129 11:06:02.165925 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.166187 kubelet[2815]: E0129 11:06:02.166057 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.166514 kubelet[2815]: E0129 11:06:02.166477 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.166514 kubelet[2815]: W0129 11:06:02.166502 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.166950 systemd[1]: Started cri-containerd-04f41d7a6ceaf3139123081ad39d85c365130acf223e1245774637d7da6939b9.scope - libcontainer container 04f41d7a6ceaf3139123081ad39d85c365130acf223e1245774637d7da6939b9. Jan 29 11:06:02.167500 kubelet[2815]: E0129 11:06:02.167100 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.167500 kubelet[2815]: E0129 11:06:02.167454 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.167500 kubelet[2815]: W0129 11:06:02.167468 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.167500 kubelet[2815]: E0129 11:06:02.167479 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.169341 kubelet[2815]: E0129 11:06:02.167982 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.169341 kubelet[2815]: E0129 11:06:02.168043 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.171310 kubelet[2815]: E0129 11:06:02.171252 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.171310 kubelet[2815]: W0129 11:06:02.171304 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.171452 kubelet[2815]: E0129 11:06:02.171325 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.171532 kubelet[2815]: E0129 11:06:02.171491 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.171532 kubelet[2815]: W0129 11:06:02.171523 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.171532 kubelet[2815]: E0129 11:06:02.171533 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.171993 kubelet[2815]: E0129 11:06:02.171679 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.171993 kubelet[2815]: W0129 11:06:02.171696 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.171993 kubelet[2815]: E0129 11:06:02.171704 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.171993 kubelet[2815]: E0129 11:06:02.171891 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.171993 kubelet[2815]: W0129 11:06:02.171912 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.171993 kubelet[2815]: E0129 11:06:02.171922 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.172164 kubelet[2815]: E0129 11:06:02.172097 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.172164 kubelet[2815]: W0129 11:06:02.172106 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.172164 kubelet[2815]: E0129 11:06:02.172119 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.172852 kubelet[2815]: E0129 11:06:02.172404 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.172852 kubelet[2815]: W0129 11:06:02.172419 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.172852 kubelet[2815]: E0129 11:06:02.172435 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.173168 kubelet[2815]: E0129 11:06:02.172891 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.173168 kubelet[2815]: W0129 11:06:02.172993 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.173168 kubelet[2815]: E0129 11:06:02.173011 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.173894 kubelet[2815]: E0129 11:06:02.173346 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.173894 kubelet[2815]: W0129 11:06:02.173362 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.173894 kubelet[2815]: E0129 11:06:02.173393 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.173894 kubelet[2815]: E0129 11:06:02.173759 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.173894 kubelet[2815]: W0129 11:06:02.173771 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.174651 kubelet[2815]: E0129 11:06:02.174007 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.174651 kubelet[2815]: E0129 11:06:02.174392 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.174651 kubelet[2815]: W0129 11:06:02.174572 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.174651 kubelet[2815]: E0129 11:06:02.174587 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.175200 kubelet[2815]: E0129 11:06:02.175064 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.175200 kubelet[2815]: W0129 11:06:02.175086 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.175200 kubelet[2815]: E0129 11:06:02.175098 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.176181 kubelet[2815]: E0129 11:06:02.175975 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.176181 kubelet[2815]: W0129 11:06:02.175992 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.176181 kubelet[2815]: E0129 11:06:02.176008 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.181033 containerd[1478]: time="2025-01-29T11:06:02.180986943Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ztccw,Uid:017b8051-dcdc-4c0d-a2ab-b0301e12bd1a,Namespace:calico-system,Attempt:0,}" Jan 29 11:06:02.188599 kubelet[2815]: E0129 11:06:02.188542 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.188599 kubelet[2815]: W0129 11:06:02.188578 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.188599 kubelet[2815]: E0129 11:06:02.188600 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.188860 kubelet[2815]: I0129 11:06:02.188666 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66d59454-c196-4ace-a57f-96550c417a39-registration-dir\") pod \"csi-node-driver-mtvgj\" (UID: \"66d59454-c196-4ace-a57f-96550c417a39\") " pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:02.189153 kubelet[2815]: E0129 11:06:02.189125 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.189153 kubelet[2815]: W0129 11:06:02.189144 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.189439 kubelet[2815]: E0129 11:06:02.189157 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.189439 kubelet[2815]: E0129 11:06:02.189433 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.189560 kubelet[2815]: W0129 11:06:02.189447 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.189560 kubelet[2815]: E0129 11:06:02.189458 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.189715 kubelet[2815]: E0129 11:06:02.189686 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.189715 kubelet[2815]: W0129 11:06:02.189708 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.189798 kubelet[2815]: E0129 11:06:02.189718 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.189981 kubelet[2815]: I0129 11:06:02.189956 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/66d59454-c196-4ace-a57f-96550c417a39-varrun\") pod \"csi-node-driver-mtvgj\" (UID: \"66d59454-c196-4ace-a57f-96550c417a39\") " pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:02.190041 kubelet[2815]: E0129 11:06:02.190032 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.190041 kubelet[2815]: W0129 11:06:02.190040 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.190089 kubelet[2815]: E0129 11:06:02.190057 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.190269 kubelet[2815]: E0129 11:06:02.190246 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.190269 kubelet[2815]: W0129 11:06:02.190262 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.190339 kubelet[2815]: E0129 11:06:02.190285 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.190946 kubelet[2815]: E0129 11:06:02.190466 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.190946 kubelet[2815]: W0129 11:06:02.190479 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.190946 kubelet[2815]: E0129 11:06:02.190488 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.190946 kubelet[2815]: I0129 11:06:02.190511 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mz9f\" (UniqueName: \"kubernetes.io/projected/66d59454-c196-4ace-a57f-96550c417a39-kube-api-access-9mz9f\") pod \"csi-node-driver-mtvgj\" (UID: \"66d59454-c196-4ace-a57f-96550c417a39\") " pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:02.190946 kubelet[2815]: E0129 11:06:02.190770 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.190946 kubelet[2815]: W0129 11:06:02.190783 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.191196 kubelet[2815]: E0129 11:06:02.190808 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.191196 kubelet[2815]: I0129 11:06:02.191078 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66d59454-c196-4ace-a57f-96550c417a39-kubelet-dir\") pod \"csi-node-driver-mtvgj\" (UID: \"66d59454-c196-4ace-a57f-96550c417a39\") " pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:02.191196 kubelet[2815]: E0129 11:06:02.191017 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.191196 kubelet[2815]: W0129 11:06:02.191104 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.191615 kubelet[2815]: E0129 11:06:02.191113 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.191677 kubelet[2815]: E0129 11:06:02.191652 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.191677 kubelet[2815]: W0129 11:06:02.191671 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.191752 kubelet[2815]: E0129 11:06:02.191691 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.192498 kubelet[2815]: E0129 11:06:02.192363 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.192498 kubelet[2815]: W0129 11:06:02.192497 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.192623 kubelet[2815]: E0129 11:06:02.192523 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.192623 kubelet[2815]: I0129 11:06:02.192545 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66d59454-c196-4ace-a57f-96550c417a39-socket-dir\") pod \"csi-node-driver-mtvgj\" (UID: \"66d59454-c196-4ace-a57f-96550c417a39\") " pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:02.193208 kubelet[2815]: E0129 11:06:02.192787 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.193208 kubelet[2815]: W0129 11:06:02.192799 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.193208 kubelet[2815]: E0129 11:06:02.192809 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.193576 kubelet[2815]: E0129 11:06:02.193545 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.193576 kubelet[2815]: W0129 11:06:02.193564 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.193576 kubelet[2815]: E0129 11:06:02.193576 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.194513 kubelet[2815]: E0129 11:06:02.194237 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.194513 kubelet[2815]: W0129 11:06:02.194258 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.194513 kubelet[2815]: E0129 11:06:02.194270 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.194513 kubelet[2815]: E0129 11:06:02.194448 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.194513 kubelet[2815]: W0129 11:06:02.194456 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.194513 kubelet[2815]: E0129 11:06:02.194466 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.227368 containerd[1478]: time="2025-01-29T11:06:02.227062968Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:02.227368 containerd[1478]: time="2025-01-29T11:06:02.227133169Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:02.227368 containerd[1478]: time="2025-01-29T11:06:02.227153969Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:02.227368 containerd[1478]: time="2025-01-29T11:06:02.227242369Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:02.265270 systemd[1]: Started cri-containerd-1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5.scope - libcontainer container 1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5. Jan 29 11:06:02.283051 containerd[1478]: time="2025-01-29T11:06:02.282999577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b94d647c7-j7qvc,Uid:22b30d62-7cc1-4383-8ded-e5aac57cff28,Namespace:calico-system,Attempt:0,} returns sandbox id \"04f41d7a6ceaf3139123081ad39d85c365130acf223e1245774637d7da6939b9\"" Jan 29 11:06:02.287530 containerd[1478]: time="2025-01-29T11:06:02.286864505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 29 11:06:02.294386 kubelet[2815]: E0129 11:06:02.294341 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.294386 kubelet[2815]: W0129 11:06:02.294371 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.294386 kubelet[2815]: E0129 11:06:02.294394 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.295461 kubelet[2815]: E0129 11:06:02.294789 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.295461 kubelet[2815]: W0129 11:06:02.294805 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.295461 kubelet[2815]: E0129 11:06:02.294853 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.295461 kubelet[2815]: E0129 11:06:02.295085 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.295461 kubelet[2815]: W0129 11:06:02.295094 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.295461 kubelet[2815]: E0129 11:06:02.295113 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.295461 kubelet[2815]: E0129 11:06:02.295314 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.295461 kubelet[2815]: W0129 11:06:02.295322 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.295461 kubelet[2815]: E0129 11:06:02.295337 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.295713 kubelet[2815]: E0129 11:06:02.295553 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.295713 kubelet[2815]: W0129 11:06:02.295562 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.295713 kubelet[2815]: E0129 11:06:02.295577 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.295835 kubelet[2815]: E0129 11:06:02.295811 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.295870 kubelet[2815]: W0129 11:06:02.295838 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.296127 kubelet[2815]: E0129 11:06:02.295921 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.296127 kubelet[2815]: E0129 11:06:02.296024 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.296127 kubelet[2815]: W0129 11:06:02.296030 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.296127 kubelet[2815]: E0129 11:06:02.296118 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.296473 kubelet[2815]: E0129 11:06:02.296326 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.296473 kubelet[2815]: W0129 11:06:02.296333 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.296473 kubelet[2815]: E0129 11:06:02.296407 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.296541 kubelet[2815]: E0129 11:06:02.296505 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.296541 kubelet[2815]: W0129 11:06:02.296512 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.296807 kubelet[2815]: E0129 11:06:02.296585 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.296807 kubelet[2815]: E0129 11:06:02.296689 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.296807 kubelet[2815]: W0129 11:06:02.296695 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.296807 kubelet[2815]: E0129 11:06:02.296773 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.297414 kubelet[2815]: E0129 11:06:02.296940 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.297414 kubelet[2815]: W0129 11:06:02.296951 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.297414 kubelet[2815]: E0129 11:06:02.296964 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.297414 kubelet[2815]: E0129 11:06:02.297164 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.297414 kubelet[2815]: W0129 11:06:02.297171 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.297414 kubelet[2815]: E0129 11:06:02.297181 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.297414 kubelet[2815]: E0129 11:06:02.297403 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.297414 kubelet[2815]: W0129 11:06:02.297412 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.297600 kubelet[2815]: E0129 11:06:02.297491 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.297628 kubelet[2815]: E0129 11:06:02.297607 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.297628 kubelet[2815]: W0129 11:06:02.297614 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.298228 kubelet[2815]: E0129 11:06:02.297695 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.298228 kubelet[2815]: E0129 11:06:02.297948 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.298228 kubelet[2815]: W0129 11:06:02.297958 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.298228 kubelet[2815]: E0129 11:06:02.298040 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.298228 kubelet[2815]: E0129 11:06:02.298150 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.298228 kubelet[2815]: W0129 11:06:02.298157 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.298228 kubelet[2815]: E0129 11:06:02.298224 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.298714 kubelet[2815]: E0129 11:06:02.298309 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.298714 kubelet[2815]: W0129 11:06:02.298316 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.298714 kubelet[2815]: E0129 11:06:02.298397 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.298714 kubelet[2815]: E0129 11:06:02.298485 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.298714 kubelet[2815]: W0129 11:06:02.298494 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.298714 kubelet[2815]: E0129 11:06:02.298508 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.298714 kubelet[2815]: E0129 11:06:02.298695 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.298714 kubelet[2815]: W0129 11:06:02.298703 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.298714 kubelet[2815]: E0129 11:06:02.298713 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.299918 kubelet[2815]: E0129 11:06:02.299169 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.299918 kubelet[2815]: W0129 11:06:02.299179 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.299918 kubelet[2815]: E0129 11:06:02.299196 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.299918 kubelet[2815]: E0129 11:06:02.299419 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.299918 kubelet[2815]: W0129 11:06:02.299428 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.299918 kubelet[2815]: E0129 11:06:02.299591 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.300055 kubelet[2815]: E0129 11:06:02.300042 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.300055 kubelet[2815]: W0129 11:06:02.300052 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.300263 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.300436 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.301903 kubelet[2815]: W0129 11:06:02.300443 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.300455 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.300838 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.301903 kubelet[2815]: W0129 11:06:02.300848 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.300884 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.301121 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.301903 kubelet[2815]: W0129 11:06:02.301131 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.301903 kubelet[2815]: E0129 11:06:02.301139 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.317346 kubelet[2815]: E0129 11:06:02.317303 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:02.317346 kubelet[2815]: W0129 11:06:02.317336 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:02.317513 kubelet[2815]: E0129 11:06:02.317364 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:02.337737 containerd[1478]: time="2025-01-29T11:06:02.336944500Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ztccw,Uid:017b8051-dcdc-4c0d-a2ab-b0301e12bd1a,Namespace:calico-system,Attempt:0,} returns sandbox id \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\"" Jan 29 11:06:03.644111 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2512776421.mount: Deactivated successfully. 
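[Note on the repeated FlexVolume probe above: the kubelet scans its volume-plugin directory, finds the nodeagent~uds entry, and tries to execute /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument "init". The binary is not installed yet, so the call produces no output, and that empty reply is what driver-call.go rejects with "unexpected end of JSON input". Below is a minimal, illustrative sketch of the JSON handshake a FlexVolume driver is expected to print for "init"; it is not Calico's actual uds driver, only the shape of a reply that would satisfy the probe.]

    #!/usr/bin/env python3
    # Illustrative sketch only (not Calico's uds driver): the JSON reply
    # kubelet's FlexVolume driver-call expects from "<driver> init".
    # An empty reply is exactly what triggers "unexpected end of JSON input"
    # in the log entries above.
    import json
    import sys

    def main() -> None:
        op = sys.argv[1] if len(sys.argv) > 1 else ""
        if op == "init":
            # Any well-formed status object satisfies the init probe.
            print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        else:
            print(json.dumps({"status": "Not supported"}))

    if __name__ == "__main__":
        main()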
Jan 29 11:06:03.659284 kubelet[2815]: E0129 11:06:03.658778 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:04.208719 containerd[1478]: time="2025-01-29T11:06:04.208644504Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:04.210662 containerd[1478]: time="2025-01-29T11:06:04.210498908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29231308" Jan 29 11:06:04.214085 containerd[1478]: time="2025-01-29T11:06:04.213942235Z" level=info msg="ImageCreate event name:\"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:04.217548 containerd[1478]: time="2025-01-29T11:06:04.217489482Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:04.218325 containerd[1478]: time="2025-01-29T11:06:04.218199604Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"29231162\" in 1.931291139s" Jan 29 11:06:04.218585 containerd[1478]: time="2025-01-29T11:06:04.218465844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\"" Jan 29 11:06:04.221157 containerd[1478]: time="2025-01-29T11:06:04.220854129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 29 11:06:04.236606 containerd[1478]: time="2025-01-29T11:06:04.236559161Z" level=info msg="CreateContainer within sandbox \"04f41d7a6ceaf3139123081ad39d85c365130acf223e1245774637d7da6939b9\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 29 11:06:04.258246 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3523975047.mount: Deactivated successfully. Jan 29 11:06:04.265877 containerd[1478]: time="2025-01-29T11:06:04.265761099Z" level=info msg="CreateContainer within sandbox \"04f41d7a6ceaf3139123081ad39d85c365130acf223e1245774637d7da6939b9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"81c890261acc5b8d41566a34bab6b5178acb773650f4575f94f67d68d9149b5e\"" Jan 29 11:06:04.267860 containerd[1478]: time="2025-01-29T11:06:04.266435741Z" level=info msg="StartContainer for \"81c890261acc5b8d41566a34bab6b5178acb773650f4575f94f67d68d9149b5e\"" Jan 29 11:06:04.307337 systemd[1]: Started cri-containerd-81c890261acc5b8d41566a34bab6b5178acb773650f4575f94f67d68d9149b5e.scope - libcontainer container 81c890261acc5b8d41566a34bab6b5178acb773650f4575f94f67d68d9149b5e. 
Jan 29 11:06:04.352643 containerd[1478]: time="2025-01-29T11:06:04.352582674Z" level=info msg="StartContainer for \"81c890261acc5b8d41566a34bab6b5178acb773650f4575f94f67d68d9149b5e\" returns successfully" Jan 29 11:06:04.894557 kubelet[2815]: E0129 11:06:04.894505 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.894557 kubelet[2815]: W0129 11:06:04.894545 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.894557 kubelet[2815]: E0129 11:06:04.894583 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.895013 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896140 kubelet[2815]: W0129 11:06:04.895033 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.895053 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.895313 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896140 kubelet[2815]: W0129 11:06:04.895328 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.895344 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.895605 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896140 kubelet[2815]: W0129 11:06:04.895619 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.895634 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:04.896140 kubelet[2815]: E0129 11:06:04.896014 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896900 kubelet[2815]: W0129 11:06:04.896030 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.896900 kubelet[2815]: E0129 11:06:04.896041 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.896900 kubelet[2815]: E0129 11:06:04.896203 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896900 kubelet[2815]: W0129 11:06:04.896211 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.896900 kubelet[2815]: E0129 11:06:04.896219 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.896900 kubelet[2815]: E0129 11:06:04.896344 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896900 kubelet[2815]: W0129 11:06:04.896352 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.896900 kubelet[2815]: E0129 11:06:04.896360 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.896900 kubelet[2815]: E0129 11:06:04.896487 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.896900 kubelet[2815]: W0129 11:06:04.896494 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.896501 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.896643 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.898260 kubelet[2815]: W0129 11:06:04.896654 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.896661 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.896826 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.898260 kubelet[2815]: W0129 11:06:04.896849 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.896859 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.896998 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.898260 kubelet[2815]: W0129 11:06:04.897006 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.898260 kubelet[2815]: E0129 11:06:04.897013 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.897200 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.899005 kubelet[2815]: W0129 11:06:04.897208 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.897216 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.897461 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.899005 kubelet[2815]: W0129 11:06:04.897469 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.897479 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.897661 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.899005 kubelet[2815]: W0129 11:06:04.897669 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.897677 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:04.899005 kubelet[2815]: E0129 11:06:04.898051 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.899517 kubelet[2815]: W0129 11:06:04.898064 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.899517 kubelet[2815]: E0129 11:06:04.898074 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.916922 kubelet[2815]: E0129 11:06:04.916764 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.916922 kubelet[2815]: W0129 11:06:04.916842 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.916922 kubelet[2815]: E0129 11:06:04.916871 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.917364 kubelet[2815]: E0129 11:06:04.917282 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.917364 kubelet[2815]: W0129 11:06:04.917293 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.917364 kubelet[2815]: E0129 11:06:04.917315 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.917571 kubelet[2815]: E0129 11:06:04.917555 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.917571 kubelet[2815]: W0129 11:06:04.917568 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.917638 kubelet[2815]: E0129 11:06:04.917589 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.918313 kubelet[2815]: E0129 11:06:04.917866 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.918313 kubelet[2815]: W0129 11:06:04.917885 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.918313 kubelet[2815]: E0129 11:06:04.917898 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:04.918313 kubelet[2815]: E0129 11:06:04.918177 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.918313 kubelet[2815]: W0129 11:06:04.918188 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.918313 kubelet[2815]: E0129 11:06:04.918251 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.918543 kubelet[2815]: E0129 11:06:04.918434 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.918543 kubelet[2815]: W0129 11:06:04.918443 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.918543 kubelet[2815]: E0129 11:06:04.918454 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.918717 kubelet[2815]: E0129 11:06:04.918624 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.918717 kubelet[2815]: W0129 11:06:04.918643 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.918717 kubelet[2815]: E0129 11:06:04.918653 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.919202 kubelet[2815]: E0129 11:06:04.918989 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.919202 kubelet[2815]: W0129 11:06:04.919004 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.919202 kubelet[2815]: E0129 11:06:04.919018 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.919517 kubelet[2815]: E0129 11:06:04.919410 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.919517 kubelet[2815]: W0129 11:06:04.919425 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.919517 kubelet[2815]: E0129 11:06:04.919443 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:04.919956 kubelet[2815]: E0129 11:06:04.919728 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.919956 kubelet[2815]: W0129 11:06:04.919791 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.919956 kubelet[2815]: E0129 11:06:04.919811 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.920186 kubelet[2815]: E0129 11:06:04.920171 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.920249 kubelet[2815]: W0129 11:06:04.920237 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.920495 kubelet[2815]: E0129 11:06:04.920308 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.920572 kubelet[2815]: E0129 11:06:04.920549 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.920572 kubelet[2815]: W0129 11:06:04.920568 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.920639 kubelet[2815]: E0129 11:06:04.920594 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.921001 kubelet[2815]: E0129 11:06:04.920975 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.921001 kubelet[2815]: W0129 11:06:04.920999 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.921122 kubelet[2815]: E0129 11:06:04.921031 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.921537 kubelet[2815]: E0129 11:06:04.921405 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.921537 kubelet[2815]: W0129 11:06:04.921447 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.921537 kubelet[2815]: E0129 11:06:04.921479 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:04.921985 kubelet[2815]: E0129 11:06:04.921882 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.921985 kubelet[2815]: W0129 11:06:04.921901 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.921985 kubelet[2815]: E0129 11:06:04.921922 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.922337 kubelet[2815]: E0129 11:06:04.922264 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.922337 kubelet[2815]: W0129 11:06:04.922295 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.922337 kubelet[2815]: E0129 11:06:04.922316 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.923537 kubelet[2815]: E0129 11:06:04.923357 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.923537 kubelet[2815]: W0129 11:06:04.923398 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.923537 kubelet[2815]: E0129 11:06:04.923452 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:04.923926 kubelet[2815]: E0129 11:06:04.923694 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:04.923926 kubelet[2815]: W0129 11:06:04.923710 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:04.923926 kubelet[2815]: E0129 11:06:04.923725 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:05.614039 containerd[1478]: time="2025-01-29T11:06:05.613948048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:05.615673 containerd[1478]: time="2025-01-29T11:06:05.615586371Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5117811" Jan 29 11:06:05.619440 containerd[1478]: time="2025-01-29T11:06:05.619354938Z" level=info msg="ImageCreate event name:\"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:05.625050 containerd[1478]: time="2025-01-29T11:06:05.624946029Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:05.628804 containerd[1478]: time="2025-01-29T11:06:05.628457075Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6487425\" in 1.407521866s" Jan 29 11:06:05.628804 containerd[1478]: time="2025-01-29T11:06:05.628545716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\"" Jan 29 11:06:05.634833 containerd[1478]: time="2025-01-29T11:06:05.634678607Z" level=info msg="CreateContainer within sandbox \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 29 11:06:05.658395 kubelet[2815]: E0129 11:06:05.658306 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:05.662645 containerd[1478]: time="2025-01-29T11:06:05.662516019Z" level=info msg="CreateContainer within sandbox \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e\"" Jan 29 11:06:05.663537 containerd[1478]: time="2025-01-29T11:06:05.663484221Z" level=info msg="StartContainer for \"9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e\"" Jan 29 11:06:05.735432 systemd[1]: Started cri-containerd-9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e.scope - libcontainer container 9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e. 
Jan 29 11:06:05.784543 containerd[1478]: time="2025-01-29T11:06:05.784466288Z" level=info msg="StartContainer for \"9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e\" returns successfully" Jan 29 11:06:05.805121 kubelet[2815]: I0129 11:06:05.803893 2815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 11:06:05.805901 kubelet[2815]: E0129 11:06:05.805855 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:05.806134 kubelet[2815]: W0129 11:06:05.806085 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:05.806184 kubelet[2815]: E0129 11:06:05.806138 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:05.808066 kubelet[2815]: E0129 11:06:05.808010 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:05.808066 kubelet[2815]: W0129 11:06:05.808046 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:05.808388 kubelet[2815]: E0129 11:06:05.808083 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:05.808599 kubelet[2815]: E0129 11:06:05.808499 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:05.808599 kubelet[2815]: W0129 11:06:05.808510 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:05.808599 kubelet[2815]: E0129 11:06:05.808522 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:05.809088 kubelet[2815]: E0129 11:06:05.809066 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:05.809088 kubelet[2815]: W0129 11:06:05.809084 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:05.809656 kubelet[2815]: E0129 11:06:05.809098 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 11:06:05.809656 kubelet[2815]: E0129 11:06:05.809472 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:05.809656 kubelet[2815]: W0129 11:06:05.809489 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:05.809656 kubelet[2815]: E0129 11:06:05.809500 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:05.809907 kubelet[2815]: E0129 11:06:05.809892 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 11:06:05.809907 kubelet[2815]: W0129 11:06:05.809906 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 11:06:05.809907 kubelet[2815]: E0129 11:06:05.809918 2815 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 11:06:05.818903 systemd[1]: cri-containerd-9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e.scope: Deactivated successfully. Jan 29 11:06:05.844443 kubelet[2815]: I0129 11:06:05.842759 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-b94d647c7-j7qvc" podStartSLOduration=2.908354254 podStartE2EDuration="4.842711078s" podCreationTimestamp="2025-01-29 11:06:01 +0000 UTC" firstStartedPulling="2025-01-29 11:06:02.285690823 +0000 UTC m=+21.734046929" lastFinishedPulling="2025-01-29 11:06:04.220047647 +0000 UTC m=+23.668403753" observedRunningTime="2025-01-29 11:06:04.820751375 +0000 UTC m=+24.269107481" watchObservedRunningTime="2025-01-29 11:06:05.842711078 +0000 UTC m=+25.291067184" Jan 29 11:06:05.952597 containerd[1478]: time="2025-01-29T11:06:05.952480124Z" level=info msg="shim disconnected" id=9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e namespace=k8s.io Jan 29 11:06:05.953234 containerd[1478]: time="2025-01-29T11:06:05.952646644Z" level=warning msg="cleaning up after shim disconnected" id=9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e namespace=k8s.io Jan 29 11:06:05.953234 containerd[1478]: time="2025-01-29T11:06:05.952660004Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 11:06:06.226068 systemd[1]: run-containerd-runc-k8s.io-9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e-runc.z3S8qC.mount: Deactivated successfully. Jan 29 11:06:06.226228 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9f23b30aae39dbff86af3d5f4ccd26d72dcfbe27012a70597964019e247e248e-rootfs.mount: Deactivated successfully. 
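[Note: the short-lived container above (flexvol-driver, created from ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1) is, in the stock Calico manifests, an init container of calico-node whose job is to copy the uds FlexVolume driver into the kubelet plugin directory probed earlier; it exits as soon as the copy is done, which is why its scope is deactivated and the shim cleaned up moments after StartContainer returns. The following is a hypothetical verification helper, assumed for illustration only; the path is the one that appears in the kubelet errors above.]

    #!/usr/bin/env python3
    # Hypothetical verification helper, not part of Calico or the kubelet:
    # check that the FlexVolume driver directory seen in the log now contains
    # an executable "uds" binary.
    import os

    DRIVER = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

    if os.path.isfile(DRIVER) and os.access(DRIVER, os.X_OK):
        print(f"flexvolume driver installed: {DRIVER}")
    else:
        print(f"flexvolume driver missing or not executable: {DRIVER}")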
Jan 29 11:06:06.817029 containerd[1478]: time="2025-01-29T11:06:06.816551761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 29 11:06:07.658683 kubelet[2815]: E0129 11:06:07.658576 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:09.545229 containerd[1478]: time="2025-01-29T11:06:09.545061967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:09.547070 containerd[1478]: time="2025-01-29T11:06:09.546963050Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=89703123" Jan 29 11:06:09.547892 containerd[1478]: time="2025-01-29T11:06:09.547731171Z" level=info msg="ImageCreate event name:\"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:09.551808 containerd[1478]: time="2025-01-29T11:06:09.550798935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:09.551808 containerd[1478]: time="2025-01-29T11:06:09.551630256Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"91072777\" in 2.735018294s" Jan 29 11:06:09.551808 containerd[1478]: time="2025-01-29T11:06:09.551663737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\"" Jan 29 11:06:09.556962 containerd[1478]: time="2025-01-29T11:06:09.556902784Z" level=info msg="CreateContainer within sandbox \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 29 11:06:09.578393 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount644596014.mount: Deactivated successfully. Jan 29 11:06:09.586348 containerd[1478]: time="2025-01-29T11:06:09.586271145Z" level=info msg="CreateContainer within sandbox \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530\"" Jan 29 11:06:09.588451 containerd[1478]: time="2025-01-29T11:06:09.588217067Z" level=info msg="StartContainer for \"e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530\"" Jan 29 11:06:09.635142 systemd[1]: Started cri-containerd-e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530.scope - libcontainer container e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530. 
Jan 29 11:06:09.659918 kubelet[2815]: E0129 11:06:09.659403 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:09.681905 containerd[1478]: time="2025-01-29T11:06:09.681848637Z" level=info msg="StartContainer for \"e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530\" returns successfully" Jan 29 11:06:10.214957 containerd[1478]: time="2025-01-29T11:06:10.214620592Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 29 11:06:10.217861 systemd[1]: cri-containerd-e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530.scope: Deactivated successfully. Jan 29 11:06:10.321069 kubelet[2815]: I0129 11:06:10.320896 2815 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Jan 29 11:06:10.356038 containerd[1478]: time="2025-01-29T11:06:10.355695332Z" level=info msg="shim disconnected" id=e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530 namespace=k8s.io Jan 29 11:06:10.356038 containerd[1478]: time="2025-01-29T11:06:10.355777652Z" level=warning msg="cleaning up after shim disconnected" id=e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530 namespace=k8s.io Jan 29 11:06:10.356038 containerd[1478]: time="2025-01-29T11:06:10.355787812Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 11:06:10.366871 kubelet[2815]: I0129 11:06:10.366805 2815 topology_manager.go:215] "Topology Admit Handler" podUID="7be7e7ab-b82d-4802-9eba-8c9eb76668a3" podNamespace="kube-system" podName="coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:10.370846 kubelet[2815]: I0129 11:06:10.369685 2815 topology_manager.go:215] "Topology Admit Handler" podUID="7a04fac7-1f8b-48e6-9fb1-4421bdb042d6" podNamespace="calico-apiserver" podName="calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:10.375139 kubelet[2815]: I0129 11:06:10.375088 2815 topology_manager.go:215] "Topology Admit Handler" podUID="f133f1ed-4ff6-4186-84a9-0e6e2dda3b55" podNamespace="kube-system" podName="coredns-7db6d8ff4d-mff4v" Jan 29 11:06:10.384166 kubelet[2815]: I0129 11:06:10.383019 2815 topology_manager.go:215] "Topology Admit Handler" podUID="4dcc55b2-baa9-4b75-9e62-2012ad104fe8" podNamespace="calico-system" podName="calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:10.387072 kubelet[2815]: I0129 11:06:10.387038 2815 topology_manager.go:215] "Topology Admit Handler" podUID="03412989-750a-48ed-b795-b7c29a91242d" podNamespace="calico-apiserver" podName="calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:10.387651 systemd[1]: Created slice kubepods-burstable-pod7be7e7ab_b82d_4802_9eba_8c9eb76668a3.slice - libcontainer container kubepods-burstable-pod7be7e7ab_b82d_4802_9eba_8c9eb76668a3.slice. 
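[Note: the "failed to reload cni configuration" message above is containerd reacting to a write in /etc/cni/net.d (here the install-cni container writing calico-kubeconfig) while still finding no usable network config, so CNI remains uninitialized and the "NetworkPluginNotReady" pod_workers errors keep repeating. Below is a minimal, illustrative check of that directory; the directory name comes from the log, the rest is an assumption about how one might inspect it.]

    #!/usr/bin/env python3
    # Illustrative only: list /etc/cni/net.d and report whether a network
    # config (.conf/.conflist/.json) is present, mirroring the condition
    # containerd complains about in the entry above.
    import os

    CNI_DIR = "/etc/cni/net.d"

    entries = sorted(os.listdir(CNI_DIR)) if os.path.isdir(CNI_DIR) else []
    configs = [e for e in entries if e.endswith((".conf", ".conflist", ".json"))]
    print("entries:", entries)
    print("network configs found:" if configs else "no network config found:", configs)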
Jan 29 11:06:10.395771 containerd[1478]: time="2025-01-29T11:06:10.393973220Z" level=warning msg="cleanup warnings time=\"2025-01-29T11:06:10Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 29 11:06:10.400963 systemd[1]: Created slice kubepods-besteffort-pod7a04fac7_1f8b_48e6_9fb1_4421bdb042d6.slice - libcontainer container kubepods-besteffort-pod7a04fac7_1f8b_48e6_9fb1_4421bdb042d6.slice. Jan 29 11:06:10.411108 systemd[1]: Created slice kubepods-burstable-podf133f1ed_4ff6_4186_84a9_0e6e2dda3b55.slice - libcontainer container kubepods-burstable-podf133f1ed_4ff6_4186_84a9_0e6e2dda3b55.slice. Jan 29 11:06:10.424938 systemd[1]: Created slice kubepods-besteffort-pod4dcc55b2_baa9_4b75_9e62_2012ad104fe8.slice - libcontainer container kubepods-besteffort-pod4dcc55b2_baa9_4b75_9e62_2012ad104fe8.slice. Jan 29 11:06:10.434110 systemd[1]: Created slice kubepods-besteffort-pod03412989_750a_48ed_b795_b7c29a91242d.slice - libcontainer container kubepods-besteffort-pod03412989_750a_48ed_b795_b7c29a91242d.slice. Jan 29 11:06:10.460579 kubelet[2815]: I0129 11:06:10.460516 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7a04fac7-1f8b-48e6-9fb1-4421bdb042d6-calico-apiserver-certs\") pod \"calico-apiserver-f785fb85f-zvlk7\" (UID: \"7a04fac7-1f8b-48e6-9fb1-4421bdb042d6\") " pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:10.460579 kubelet[2815]: I0129 11:06:10.460570 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjl8c\" (UniqueName: \"kubernetes.io/projected/03412989-750a-48ed-b795-b7c29a91242d-kube-api-access-xjl8c\") pod \"calico-apiserver-f785fb85f-vp8hz\" (UID: \"03412989-750a-48ed-b795-b7c29a91242d\") " pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:10.460579 kubelet[2815]: I0129 11:06:10.460592 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7be7e7ab-b82d-4802-9eba-8c9eb76668a3-config-volume\") pod \"coredns-7db6d8ff4d-dcjg8\" (UID: \"7be7e7ab-b82d-4802-9eba-8c9eb76668a3\") " pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:10.460874 kubelet[2815]: I0129 11:06:10.460617 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dcc55b2-baa9-4b75-9e62-2012ad104fe8-tigera-ca-bundle\") pod \"calico-kube-controllers-d6547ffc8-zlmj5\" (UID: \"4dcc55b2-baa9-4b75-9e62-2012ad104fe8\") " pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:10.460874 kubelet[2815]: I0129 11:06:10.460644 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8z7s\" (UniqueName: \"kubernetes.io/projected/7be7e7ab-b82d-4802-9eba-8c9eb76668a3-kube-api-access-n8z7s\") pod \"coredns-7db6d8ff4d-dcjg8\" (UID: \"7be7e7ab-b82d-4802-9eba-8c9eb76668a3\") " pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:10.460874 kubelet[2815]: I0129 11:06:10.460668 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03412989-750a-48ed-b795-b7c29a91242d-calico-apiserver-certs\") pod 
\"calico-apiserver-f785fb85f-vp8hz\" (UID: \"03412989-750a-48ed-b795-b7c29a91242d\") " pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:10.460874 kubelet[2815]: I0129 11:06:10.460692 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f133f1ed-4ff6-4186-84a9-0e6e2dda3b55-config-volume\") pod \"coredns-7db6d8ff4d-mff4v\" (UID: \"f133f1ed-4ff6-4186-84a9-0e6e2dda3b55\") " pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:10.460874 kubelet[2815]: I0129 11:06:10.460726 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7n5b\" (UniqueName: \"kubernetes.io/projected/f133f1ed-4ff6-4186-84a9-0e6e2dda3b55-kube-api-access-h7n5b\") pod \"coredns-7db6d8ff4d-mff4v\" (UID: \"f133f1ed-4ff6-4186-84a9-0e6e2dda3b55\") " pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:10.461967 kubelet[2815]: I0129 11:06:10.460748 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqc87\" (UniqueName: \"kubernetes.io/projected/7a04fac7-1f8b-48e6-9fb1-4421bdb042d6-kube-api-access-rqc87\") pod \"calico-apiserver-f785fb85f-zvlk7\" (UID: \"7a04fac7-1f8b-48e6-9fb1-4421bdb042d6\") " pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:10.461967 kubelet[2815]: I0129 11:06:10.460770 2815 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjjj\" (UniqueName: \"kubernetes.io/projected/4dcc55b2-baa9-4b75-9e62-2012ad104fe8-kube-api-access-zmjjj\") pod \"calico-kube-controllers-d6547ffc8-zlmj5\" (UID: \"4dcc55b2-baa9-4b75-9e62-2012ad104fe8\") " pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:10.573843 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e939a4a8db06a9ec0b7b50a9b054362f9ee4eec4895b1eae73fc54057218a530-rootfs.mount: Deactivated successfully. 
Jan 29 11:06:10.696162 containerd[1478]: time="2025-01-29T11:06:10.695365444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:0,}" Jan 29 11:06:10.708445 containerd[1478]: time="2025-01-29T11:06:10.708115461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:0,}" Jan 29 11:06:10.735846 containerd[1478]: time="2025-01-29T11:06:10.735426655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:0,}" Jan 29 11:06:10.736187 containerd[1478]: time="2025-01-29T11:06:10.736165096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:0,}" Jan 29 11:06:10.739933 containerd[1478]: time="2025-01-29T11:06:10.739825261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:0,}" Jan 29 11:06:10.847371 containerd[1478]: time="2025-01-29T11:06:10.847039398Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 29 11:06:10.944077 containerd[1478]: time="2025-01-29T11:06:10.944026521Z" level=error msg="Failed to destroy network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:10.944757 containerd[1478]: time="2025-01-29T11:06:10.944688402Z" level=error msg="encountered an error cleaning up failed sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:10.944956 containerd[1478]: time="2025-01-29T11:06:10.944925402Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:10.945600 kubelet[2815]: E0129 11:06:10.945254 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:10.945600 kubelet[2815]: E0129 11:06:10.945318 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:10.945600 kubelet[2815]: E0129 11:06:10.945338 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:10.948031 kubelet[2815]: E0129 11:06:10.945375 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" podUID="7a04fac7-1f8b-48e6-9fb1-4421bdb042d6" Jan 29 11:06:11.010373 containerd[1478]: time="2025-01-29T11:06:11.009304323Z" level=error msg="Failed to destroy network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.011250 containerd[1478]: time="2025-01-29T11:06:11.011215766Z" level=error msg="encountered an error cleaning up failed sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.011403 containerd[1478]: time="2025-01-29T11:06:11.011380446Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.011808 kubelet[2815]: E0129 11:06:11.011772 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.013557 kubelet[2815]: E0129 11:06:11.012584 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:11.013557 kubelet[2815]: E0129 11:06:11.012627 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:11.014525 kubelet[2815]: E0129 11:06:11.013896 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" podUID="4dcc55b2-baa9-4b75-9e62-2012ad104fe8" Jan 29 11:06:11.016841 containerd[1478]: time="2025-01-29T11:06:11.016434612Z" level=error msg="Failed to destroy network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.019389 containerd[1478]: time="2025-01-29T11:06:11.018880455Z" level=error msg="encountered an error cleaning up failed sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.019389 containerd[1478]: time="2025-01-29T11:06:11.019137295Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.019654 kubelet[2815]: E0129 11:06:11.019623 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.019884 
kubelet[2815]: E0129 11:06:11.019808 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:11.019995 kubelet[2815]: E0129 11:06:11.019973 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:11.020037 containerd[1478]: time="2025-01-29T11:06:11.019942056Z" level=error msg="Failed to destroy network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.020610 kubelet[2815]: E0129 11:06:11.020202 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-dcjg8" podUID="7be7e7ab-b82d-4802-9eba-8c9eb76668a3" Jan 29 11:06:11.020754 containerd[1478]: time="2025-01-29T11:06:11.020331256Z" level=error msg="encountered an error cleaning up failed sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.020754 containerd[1478]: time="2025-01-29T11:06:11.020399336Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.021480 kubelet[2815]: E0129 11:06:11.020923 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Jan 29 11:06:11.021480 kubelet[2815]: E0129 11:06:11.020959 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:11.021480 kubelet[2815]: E0129 11:06:11.020974 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:11.021620 kubelet[2815]: E0129 11:06:11.021004 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-mff4v" podUID="f133f1ed-4ff6-4186-84a9-0e6e2dda3b55" Jan 29 11:06:11.023991 containerd[1478]: time="2025-01-29T11:06:11.023951100Z" level=error msg="Failed to destroy network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.024398 containerd[1478]: time="2025-01-29T11:06:11.024368701Z" level=error msg="encountered an error cleaning up failed sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.024541 containerd[1478]: time="2025-01-29T11:06:11.024505621Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.024965 kubelet[2815]: E0129 11:06:11.024801 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.024965 kubelet[2815]: E0129 11:06:11.024866 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:11.024965 kubelet[2815]: E0129 11:06:11.024885 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:11.025118 kubelet[2815]: E0129 11:06:11.024919 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" podUID="03412989-750a-48ed-b795-b7c29a91242d" Jan 29 11:06:11.666238 systemd[1]: Created slice kubepods-besteffort-pod66d59454_c196_4ace_a57f_96550c417a39.slice - libcontainer container kubepods-besteffort-pod66d59454_c196_4ace_a57f_96550c417a39.slice. 
Jan 29 11:06:11.669451 containerd[1478]: time="2025-01-29T11:06:11.669102132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:0,}" Jan 29 11:06:11.737974 containerd[1478]: time="2025-01-29T11:06:11.737913372Z" level=error msg="Failed to destroy network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.739265 containerd[1478]: time="2025-01-29T11:06:11.738669213Z" level=error msg="encountered an error cleaning up failed sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.739265 containerd[1478]: time="2025-01-29T11:06:11.738826693Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.740521 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9-shm.mount: Deactivated successfully. 
Jan 29 11:06:11.740860 kubelet[2815]: E0129 11:06:11.740644 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:11.743172 kubelet[2815]: E0129 11:06:11.740945 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:11.743172 kubelet[2815]: E0129 11:06:11.740974 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:11.743172 kubelet[2815]: E0129 11:06:11.741034 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:11.847580 kubelet[2815]: I0129 11:06:11.847335 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab" Jan 29 11:06:11.848643 containerd[1478]: time="2025-01-29T11:06:11.848314380Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:11.848643 containerd[1478]: time="2025-01-29T11:06:11.848488140Z" level=info msg="Ensure that sandbox 7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab in task-service has been cleanup successfully" Jan 29 11:06:11.849793 kubelet[2815]: I0129 11:06:11.849404 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73" Jan 29 11:06:11.849911 containerd[1478]: time="2025-01-29T11:06:11.849888862Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:11.849983 containerd[1478]: time="2025-01-29T11:06:11.849970862Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 11:06:11.850485 containerd[1478]: 
time="2025-01-29T11:06:11.850268503Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:11.851207 containerd[1478]: time="2025-01-29T11:06:11.851178304Z" level=info msg="Ensure that sandbox 2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73 in task-service has been cleanup successfully" Jan 29 11:06:11.852327 containerd[1478]: time="2025-01-29T11:06:11.852304785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:1,}" Jan 29 11:06:11.853935 containerd[1478]: time="2025-01-29T11:06:11.853847587Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:11.853935 containerd[1478]: time="2025-01-29T11:06:11.853869227Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:11.854008 systemd[1]: run-netns-cni\x2dbc8e5880\x2d2cc1\x2de92b\x2d5af7\x2dcc99edb451bd.mount: Deactivated successfully. Jan 29 11:06:11.855548 containerd[1478]: time="2025-01-29T11:06:11.855120108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:1,}" Jan 29 11:06:11.856664 kubelet[2815]: I0129 11:06:11.856213 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b" Jan 29 11:06:11.857189 containerd[1478]: time="2025-01-29T11:06:11.857148711Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:11.859343 containerd[1478]: time="2025-01-29T11:06:11.857497311Z" level=info msg="Ensure that sandbox b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b in task-service has been cleanup successfully" Jan 29 11:06:11.860189 containerd[1478]: time="2025-01-29T11:06:11.860156314Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:11.860403 containerd[1478]: time="2025-01-29T11:06:11.860385954Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:11.861851 containerd[1478]: time="2025-01-29T11:06:11.861788476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:1,}" Jan 29 11:06:11.862196 kubelet[2815]: I0129 11:06:11.862164 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9" Jan 29 11:06:11.862679 containerd[1478]: time="2025-01-29T11:06:11.862645397Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:11.862903 containerd[1478]: time="2025-01-29T11:06:11.862877277Z" level=info msg="Ensure that sandbox a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9 in task-service has been cleanup successfully" Jan 29 11:06:11.863102 containerd[1478]: time="2025-01-29T11:06:11.863076117Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 
11:06:11.863102 containerd[1478]: time="2025-01-29T11:06:11.863096197Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:11.865383 containerd[1478]: time="2025-01-29T11:06:11.864794279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:1,}" Jan 29 11:06:11.866126 kubelet[2815]: I0129 11:06:11.866102 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b" Jan 29 11:06:11.867852 containerd[1478]: time="2025-01-29T11:06:11.867386762Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:11.867852 containerd[1478]: time="2025-01-29T11:06:11.867630523Z" level=info msg="Ensure that sandbox 42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b in task-service has been cleanup successfully" Jan 29 11:06:11.869276 containerd[1478]: time="2025-01-29T11:06:11.869241725Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:11.869276 containerd[1478]: time="2025-01-29T11:06:11.869267405Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:11.869690 kubelet[2815]: I0129 11:06:11.869668 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a" Jan 29 11:06:11.870864 containerd[1478]: time="2025-01-29T11:06:11.870783006Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:11.871552 containerd[1478]: time="2025-01-29T11:06:11.871522087Z" level=info msg="Ensure that sandbox 92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a in task-service has been cleanup successfully" Jan 29 11:06:11.872114 containerd[1478]: time="2025-01-29T11:06:11.871858168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:1,}" Jan 29 11:06:11.872465 containerd[1478]: time="2025-01-29T11:06:11.872430368Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:11.872465 containerd[1478]: time="2025-01-29T11:06:11.872464008Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:11.874272 containerd[1478]: time="2025-01-29T11:06:11.873052409Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:1,}" Jan 29 11:06:12.063079 containerd[1478]: time="2025-01-29T11:06:12.062937784Z" level=error msg="Failed to destroy network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.065485 containerd[1478]: time="2025-01-29T11:06:12.065444626Z" level=error msg="encountered an error cleaning up failed sandbox 
\"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.065636 containerd[1478]: time="2025-01-29T11:06:12.065615826Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.068300 kubelet[2815]: E0129 11:06:12.066985 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.068300 kubelet[2815]: E0129 11:06:12.067051 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:12.068300 kubelet[2815]: E0129 11:06:12.067071 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:12.068670 kubelet[2815]: E0129 11:06:12.067113 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" podUID="7a04fac7-1f8b-48e6-9fb1-4421bdb042d6" Jan 29 11:06:12.069252 containerd[1478]: time="2025-01-29T11:06:12.068987510Z" level=error msg="Failed to destroy network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 
11:06:12.069331 containerd[1478]: time="2025-01-29T11:06:12.069278750Z" level=error msg="encountered an error cleaning up failed sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.069356 containerd[1478]: time="2025-01-29T11:06:12.069328470Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.069592 kubelet[2815]: E0129 11:06:12.069492 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.069592 kubelet[2815]: E0129 11:06:12.069538 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:12.069592 kubelet[2815]: E0129 11:06:12.069555 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:12.069840 kubelet[2815]: E0129 11:06:12.069585 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" podUID="4dcc55b2-baa9-4b75-9e62-2012ad104fe8" Jan 29 11:06:12.074669 containerd[1478]: time="2025-01-29T11:06:12.074614396Z" level=error msg="Failed to destroy network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.076552 containerd[1478]: time="2025-01-29T11:06:12.076338358Z" level=error msg="encountered an error cleaning up failed sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.077766 containerd[1478]: time="2025-01-29T11:06:12.076752358Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.077869 kubelet[2815]: E0129 11:06:12.077397 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.077869 kubelet[2815]: E0129 11:06:12.077469 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:12.077869 kubelet[2815]: E0129 11:06:12.077576 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:12.078581 kubelet[2815]: E0129 11:06:12.077619 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-dcjg8" podUID="7be7e7ab-b82d-4802-9eba-8c9eb76668a3" Jan 29 11:06:12.084233 containerd[1478]: time="2025-01-29T11:06:12.084190646Z" level=error msg="Failed to destroy network for sandbox 
\"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.084691 containerd[1478]: time="2025-01-29T11:06:12.084532326Z" level=error msg="encountered an error cleaning up failed sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.084691 containerd[1478]: time="2025-01-29T11:06:12.084589366Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.084907 kubelet[2815]: E0129 11:06:12.084801 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.085094 kubelet[2815]: E0129 11:06:12.084924 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:12.085094 kubelet[2815]: E0129 11:06:12.084955 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:12.085094 kubelet[2815]: E0129 11:06:12.085001 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:12.091806 containerd[1478]: time="2025-01-29T11:06:12.091747734Z" 
level=error msg="Failed to destroy network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.092510 containerd[1478]: time="2025-01-29T11:06:12.092418455Z" level=error msg="encountered an error cleaning up failed sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.092510 containerd[1478]: time="2025-01-29T11:06:12.092520695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.093083 kubelet[2815]: E0129 11:06:12.092757 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.093083 kubelet[2815]: E0129 11:06:12.092837 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:12.093083 kubelet[2815]: E0129 11:06:12.092859 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:12.093208 kubelet[2815]: E0129 11:06:12.092897 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-mff4v" podUID="f133f1ed-4ff6-4186-84a9-0e6e2dda3b55" Jan 29 11:06:12.098896 
containerd[1478]: time="2025-01-29T11:06:12.098337941Z" level=error msg="Failed to destroy network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.099447 containerd[1478]: time="2025-01-29T11:06:12.099411942Z" level=error msg="encountered an error cleaning up failed sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.099583 containerd[1478]: time="2025-01-29T11:06:12.099561662Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.099986 kubelet[2815]: E0129 11:06:12.099938 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:12.100077 kubelet[2815]: E0129 11:06:12.100002 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:12.100077 kubelet[2815]: E0129 11:06:12.100033 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:12.100145 kubelet[2815]: E0129 11:06:12.100074 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" podUID="03412989-750a-48ed-b795-b7c29a91242d" Jan 29 11:06:12.579734 systemd[1]: run-netns-cni\x2d62077124\x2dc0f8\x2da1fe\x2df03a\x2dbad9f53f03d1.mount: Deactivated successfully. Jan 29 11:06:12.581436 systemd[1]: run-netns-cni\x2da9d76db6\x2deee4\x2dc033\x2d58f1\x2de3de6b9f81ea.mount: Deactivated successfully. Jan 29 11:06:12.581492 systemd[1]: run-netns-cni\x2d22347911\x2ddb10\x2d0f62\x2dd81c\x2d8c8421fd7610.mount: Deactivated successfully. Jan 29 11:06:12.581537 systemd[1]: run-netns-cni\x2d0d97f027\x2dc611\x2da0af\x2db72e\x2d393b3cfb5425.mount: Deactivated successfully. Jan 29 11:06:12.581579 systemd[1]: run-netns-cni\x2da53ca854\x2d3377\x2dcfa0\x2d1ba4\x2d6dc5d4a95a72.mount: Deactivated successfully. Jan 29 11:06:12.875322 kubelet[2815]: I0129 11:06:12.874370 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce" Jan 29 11:06:12.875427 containerd[1478]: time="2025-01-29T11:06:12.874972963Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:12.875427 containerd[1478]: time="2025-01-29T11:06:12.875135603Z" level=info msg="Ensure that sandbox 30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce in task-service has been cleanup successfully" Jan 29 11:06:12.876386 containerd[1478]: time="2025-01-29T11:06:12.876068444Z" level=info msg="TearDown network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" successfully" Jan 29 11:06:12.876386 containerd[1478]: time="2025-01-29T11:06:12.876089724Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" returns successfully" Jan 29 11:06:12.879266 containerd[1478]: time="2025-01-29T11:06:12.878768727Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:12.879266 containerd[1478]: time="2025-01-29T11:06:12.878868567Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 11:06:12.879266 containerd[1478]: time="2025-01-29T11:06:12.878878727Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:12.880329 systemd[1]: run-netns-cni\x2d83c238f2\x2d2c5d\x2d758a\x2d2d6d\x2d51e4eb6b7997.mount: Deactivated successfully. 
Jan 29 11:06:12.881756 containerd[1478]: time="2025-01-29T11:06:12.881695370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:2,}" Jan 29 11:06:12.884981 kubelet[2815]: I0129 11:06:12.884222 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f" Jan 29 11:06:12.885890 containerd[1478]: time="2025-01-29T11:06:12.885503054Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 11:06:12.885890 containerd[1478]: time="2025-01-29T11:06:12.885655574Z" level=info msg="Ensure that sandbox 6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f in task-service has been cleanup successfully" Jan 29 11:06:12.888581 systemd[1]: run-netns-cni\x2d703f51e2\x2d7632\x2df561\x2d4b84\x2deff1253e2905.mount: Deactivated successfully. Jan 29 11:06:12.890898 containerd[1478]: time="2025-01-29T11:06:12.889105698Z" level=info msg="TearDown network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" successfully" Jan 29 11:06:12.890898 containerd[1478]: time="2025-01-29T11:06:12.889142098Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" returns successfully" Jan 29 11:06:12.893167 containerd[1478]: time="2025-01-29T11:06:12.891957621Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:12.893167 containerd[1478]: time="2025-01-29T11:06:12.892068421Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:12.893167 containerd[1478]: time="2025-01-29T11:06:12.892079381Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:12.895458 containerd[1478]: time="2025-01-29T11:06:12.895418224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:2,}" Jan 29 11:06:12.896067 kubelet[2815]: I0129 11:06:12.896028 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1" Jan 29 11:06:12.898660 containerd[1478]: time="2025-01-29T11:06:12.898413068Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:12.900181 containerd[1478]: time="2025-01-29T11:06:12.900129749Z" level=info msg="Ensure that sandbox ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1 in task-service has been cleanup successfully" Jan 29 11:06:12.904263 containerd[1478]: time="2025-01-29T11:06:12.902945952Z" level=info msg="TearDown network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" successfully" Jan 29 11:06:12.904263 containerd[1478]: time="2025-01-29T11:06:12.902981872Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" returns successfully" Jan 29 11:06:12.904785 systemd[1]: run-netns-cni\x2d05401499\x2d8c7e\x2d9f7d\x2d2b36\x2dd0636ebc67f8.mount: Deactivated successfully. 
Jan 29 11:06:12.905490 kubelet[2815]: I0129 11:06:12.905407 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5" Jan 29 11:06:12.908062 containerd[1478]: time="2025-01-29T11:06:12.907302877Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:12.908062 containerd[1478]: time="2025-01-29T11:06:12.907498357Z" level=info msg="Ensure that sandbox 10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5 in task-service has been cleanup successfully" Jan 29 11:06:12.909843 containerd[1478]: time="2025-01-29T11:06:12.908662078Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:12.909843 containerd[1478]: time="2025-01-29T11:06:12.908781398Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:12.909843 containerd[1478]: time="2025-01-29T11:06:12.908792119Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:12.913129 containerd[1478]: time="2025-01-29T11:06:12.912939283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:2,}" Jan 29 11:06:12.913851 systemd[1]: run-netns-cni\x2d5468463f\x2d8eae\x2d2118\x2d2310\x2de8a2fe424307.mount: Deactivated successfully. Jan 29 11:06:12.915495 containerd[1478]: time="2025-01-29T11:06:12.914151964Z" level=info msg="TearDown network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" successfully" Jan 29 11:06:12.915495 containerd[1478]: time="2025-01-29T11:06:12.915431326Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" returns successfully" Jan 29 11:06:12.917598 containerd[1478]: time="2025-01-29T11:06:12.917561448Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:12.919050 containerd[1478]: time="2025-01-29T11:06:12.918516409Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:12.919050 containerd[1478]: time="2025-01-29T11:06:12.918540969Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 11:06:12.919415 kubelet[2815]: I0129 11:06:12.919392 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4" Jan 29 11:06:12.919989 containerd[1478]: time="2025-01-29T11:06:12.919952490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:2,}" Jan 29 11:06:12.921166 containerd[1478]: time="2025-01-29T11:06:12.921135372Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:12.924464 kubelet[2815]: I0129 11:06:12.924440 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1" Jan 29 11:06:12.925481 containerd[1478]: time="2025-01-29T11:06:12.925444976Z" 
level=info msg="Ensure that sandbox e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4 in task-service has been cleanup successfully" Jan 29 11:06:12.926113 containerd[1478]: time="2025-01-29T11:06:12.926076537Z" level=info msg="TearDown network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" successfully" Jan 29 11:06:12.926190 containerd[1478]: time="2025-01-29T11:06:12.926141417Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" returns successfully" Jan 29 11:06:12.926664 containerd[1478]: time="2025-01-29T11:06:12.926611097Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:12.928341 containerd[1478]: time="2025-01-29T11:06:12.928136299Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:12.928775 containerd[1478]: time="2025-01-29T11:06:12.928565739Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:12.929014 containerd[1478]: time="2025-01-29T11:06:12.928980900Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:12.929083 containerd[1478]: time="2025-01-29T11:06:12.928970500Z" level=info msg="Ensure that sandbox 7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1 in task-service has been cleanup successfully" Jan 29 11:06:12.930079 containerd[1478]: time="2025-01-29T11:06:12.930045661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:2,}" Jan 29 11:06:12.931568 containerd[1478]: time="2025-01-29T11:06:12.931525263Z" level=info msg="TearDown network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" successfully" Jan 29 11:06:12.931568 containerd[1478]: time="2025-01-29T11:06:12.931563303Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" returns successfully" Jan 29 11:06:12.932769 containerd[1478]: time="2025-01-29T11:06:12.932566584Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:12.932769 containerd[1478]: time="2025-01-29T11:06:12.932680744Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:12.932769 containerd[1478]: time="2025-01-29T11:06:12.932690824Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:12.934837 containerd[1478]: time="2025-01-29T11:06:12.934659066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:2,}" Jan 29 11:06:13.059272 containerd[1478]: time="2025-01-29T11:06:13.059205832Z" level=error msg="Failed to destroy network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.062985 containerd[1478]: time="2025-01-29T11:06:13.062937915Z" level=error 
msg="encountered an error cleaning up failed sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.064503 containerd[1478]: time="2025-01-29T11:06:13.064157756Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.066440 kubelet[2815]: E0129 11:06:13.066378 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.066522 kubelet[2815]: E0129 11:06:13.066442 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:13.066522 kubelet[2815]: E0129 11:06:13.066463 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:13.066522 kubelet[2815]: E0129 11:06:13.066503 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:13.108157 containerd[1478]: time="2025-01-29T11:06:13.107649278Z" level=error msg="Failed to destroy network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.110267 containerd[1478]: 
time="2025-01-29T11:06:13.110209920Z" level=error msg="encountered an error cleaning up failed sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.110957 containerd[1478]: time="2025-01-29T11:06:13.110588961Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.112027 kubelet[2815]: E0129 11:06:13.111471 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.112027 kubelet[2815]: E0129 11:06:13.111653 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:13.112027 kubelet[2815]: E0129 11:06:13.111680 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:13.112524 kubelet[2815]: E0129 11:06:13.111736 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" podUID="4dcc55b2-baa9-4b75-9e62-2012ad104fe8" Jan 29 11:06:13.166940 containerd[1478]: time="2025-01-29T11:06:13.164893773Z" level=error msg="Failed to destroy network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.166940 containerd[1478]: time="2025-01-29T11:06:13.165264613Z" level=error msg="encountered an error cleaning up failed sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.166940 containerd[1478]: time="2025-01-29T11:06:13.165322293Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.167123 kubelet[2815]: E0129 11:06:13.166343 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.167123 kubelet[2815]: E0129 11:06:13.166405 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:13.167123 kubelet[2815]: E0129 11:06:13.166424 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:13.167208 kubelet[2815]: E0129 11:06:13.166464 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" podUID="7a04fac7-1f8b-48e6-9fb1-4421bdb042d6" Jan 29 11:06:13.185598 containerd[1478]: time="2025-01-29T11:06:13.184445791Z" level=error msg="Failed to destroy network 
for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.185598 containerd[1478]: time="2025-01-29T11:06:13.185374072Z" level=error msg="encountered an error cleaning up failed sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.185598 containerd[1478]: time="2025-01-29T11:06:13.185448272Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.185992 kubelet[2815]: E0129 11:06:13.185922 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.186069 kubelet[2815]: E0129 11:06:13.186030 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:13.186069 kubelet[2815]: E0129 11:06:13.186050 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:13.186275 kubelet[2815]: E0129 11:06:13.186112 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-dcjg8" podUID="7be7e7ab-b82d-4802-9eba-8c9eb76668a3" Jan 29 11:06:13.197203 containerd[1478]: 
time="2025-01-29T11:06:13.197161483Z" level=error msg="Failed to destroy network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.197654 containerd[1478]: time="2025-01-29T11:06:13.197629484Z" level=error msg="encountered an error cleaning up failed sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.197806 containerd[1478]: time="2025-01-29T11:06:13.197784284Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.198565 kubelet[2815]: E0129 11:06:13.198230 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.198565 kubelet[2815]: E0129 11:06:13.198289 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:13.198565 kubelet[2815]: E0129 11:06:13.198307 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:13.198738 kubelet[2815]: E0129 11:06:13.198349 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-mff4v" 
podUID="f133f1ed-4ff6-4186-84a9-0e6e2dda3b55" Jan 29 11:06:13.214011 containerd[1478]: time="2025-01-29T11:06:13.213963659Z" level=error msg="Failed to destroy network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.214584 containerd[1478]: time="2025-01-29T11:06:13.214557140Z" level=error msg="encountered an error cleaning up failed sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.214726 containerd[1478]: time="2025-01-29T11:06:13.214686100Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.215058 kubelet[2815]: E0129 11:06:13.215021 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:13.215202 kubelet[2815]: E0129 11:06:13.215181 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:13.215271 kubelet[2815]: E0129 11:06:13.215257 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:13.215381 kubelet[2815]: E0129 11:06:13.215357 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" podUID="03412989-750a-48ed-b795-b7c29a91242d" Jan 29 11:06:13.578305 systemd[1]: run-netns-cni\x2d9611f0fd\x2da90a\x2d91e4\x2d5969\x2d938e370d6d7b.mount: Deactivated successfully. Jan 29 11:06:13.578632 systemd[1]: run-netns-cni\x2d0d1b3da9\x2dc674\x2d7c00\x2dd4e0\x2dcdd1d2c2af27.mount: Deactivated successfully. Jan 29 11:06:13.930246 kubelet[2815]: I0129 11:06:13.930205 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c" Jan 29 11:06:13.934116 containerd[1478]: time="2025-01-29T11:06:13.931228545Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" Jan 29 11:06:13.934116 containerd[1478]: time="2025-01-29T11:06:13.931412825Z" level=info msg="Ensure that sandbox 8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c in task-service has been cleanup successfully" Jan 29 11:06:13.933737 systemd[1]: run-netns-cni\x2dce8545d1\x2d5df8\x2d4d48\x2ddf94\x2dd6e6f2723bef.mount: Deactivated successfully. Jan 29 11:06:13.935787 kubelet[2815]: I0129 11:06:13.935447 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840" Jan 29 11:06:13.936252 containerd[1478]: time="2025-01-29T11:06:13.936220429Z" level=info msg="TearDown network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" successfully" Jan 29 11:06:13.937165 containerd[1478]: time="2025-01-29T11:06:13.937131110Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" returns successfully" Jan 29 11:06:13.939440 containerd[1478]: time="2025-01-29T11:06:13.936501750Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" Jan 29 11:06:13.939440 containerd[1478]: time="2025-01-29T11:06:13.939174472Z" level=info msg="Ensure that sandbox 2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840 in task-service has been cleanup successfully" Jan 29 11:06:13.941303 containerd[1478]: time="2025-01-29T11:06:13.941066874Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:13.941303 containerd[1478]: time="2025-01-29T11:06:13.941160154Z" level=info msg="TearDown network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" successfully" Jan 29 11:06:13.941303 containerd[1478]: time="2025-01-29T11:06:13.941170114Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" returns successfully" Jan 29 11:06:13.943746 systemd[1]: run-netns-cni\x2d52e0161d\x2de3dd\x2da65e\x2d2fd5\x2de0f9daf90c8c.mount: Deactivated successfully. 
Jan 29 11:06:13.944481 containerd[1478]: time="2025-01-29T11:06:13.944453877Z" level=info msg="TearDown network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" successfully" Jan 29 11:06:13.945549 containerd[1478]: time="2025-01-29T11:06:13.944596277Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" returns successfully" Jan 29 11:06:13.945549 containerd[1478]: time="2025-01-29T11:06:13.945117118Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:13.945549 containerd[1478]: time="2025-01-29T11:06:13.945291158Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:13.945549 containerd[1478]: time="2025-01-29T11:06:13.945304518Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:13.945718 kubelet[2815]: I0129 11:06:13.945656 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d" Jan 29 11:06:13.947970 containerd[1478]: time="2025-01-29T11:06:13.946682519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:3,}" Jan 29 11:06:13.947970 containerd[1478]: time="2025-01-29T11:06:13.947364800Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:13.947970 containerd[1478]: time="2025-01-29T11:06:13.947461560Z" level=info msg="TearDown network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" successfully" Jan 29 11:06:13.947970 containerd[1478]: time="2025-01-29T11:06:13.947470480Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" returns successfully" Jan 29 11:06:13.947970 containerd[1478]: time="2025-01-29T11:06:13.947672360Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" Jan 29 11:06:13.948384 containerd[1478]: time="2025-01-29T11:06:13.948356601Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:13.948950 containerd[1478]: time="2025-01-29T11:06:13.948588481Z" level=info msg="Ensure that sandbox 059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d in task-service has been cleanup successfully" Jan 29 11:06:13.949342 containerd[1478]: time="2025-01-29T11:06:13.948636721Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:13.949342 containerd[1478]: time="2025-01-29T11:06:13.949335322Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 11:06:13.954125 containerd[1478]: time="2025-01-29T11:06:13.953062565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:3,}" Jan 29 11:06:13.953650 systemd[1]: run-netns-cni\x2d6e46c2a8\x2dc186\x2d4b77\x2d271f\x2db79a2e8dc627.mount: Deactivated successfully. 
Jan 29 11:06:13.954782 containerd[1478]: time="2025-01-29T11:06:13.954628807Z" level=info msg="TearDown network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" successfully" Jan 29 11:06:13.954782 containerd[1478]: time="2025-01-29T11:06:13.954658727Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" returns successfully" Jan 29 11:06:13.958231 containerd[1478]: time="2025-01-29T11:06:13.958190370Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:13.958596 containerd[1478]: time="2025-01-29T11:06:13.958326810Z" level=info msg="TearDown network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" successfully" Jan 29 11:06:13.958596 containerd[1478]: time="2025-01-29T11:06:13.958342370Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" returns successfully" Jan 29 11:06:13.960033 kubelet[2815]: I0129 11:06:13.959020 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9" Jan 29 11:06:13.960120 containerd[1478]: time="2025-01-29T11:06:13.959511572Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:13.960120 containerd[1478]: time="2025-01-29T11:06:13.959593052Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:13.960120 containerd[1478]: time="2025-01-29T11:06:13.959602332Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:13.961650 containerd[1478]: time="2025-01-29T11:06:13.961138133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:3,}" Jan 29 11:06:13.961650 containerd[1478]: time="2025-01-29T11:06:13.961631174Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" Jan 29 11:06:13.962352 containerd[1478]: time="2025-01-29T11:06:13.962217294Z" level=info msg="Ensure that sandbox 09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9 in task-service has been cleanup successfully" Jan 29 11:06:13.964549 systemd[1]: run-netns-cni\x2ddd83e1de\x2ddc28\x2d29b2\x2d8fa5\x2dd761d10027f2.mount: Deactivated successfully. 
Jan 29 11:06:13.967898 kubelet[2815]: I0129 11:06:13.967379 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414" Jan 29 11:06:13.970206 containerd[1478]: time="2025-01-29T11:06:13.970168342Z" level=info msg="TearDown network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" successfully" Jan 29 11:06:13.970296 containerd[1478]: time="2025-01-29T11:06:13.970282462Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" returns successfully" Jan 29 11:06:13.971994 containerd[1478]: time="2025-01-29T11:06:13.971837783Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" Jan 29 11:06:13.973577 containerd[1478]: time="2025-01-29T11:06:13.973002904Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:13.973577 containerd[1478]: time="2025-01-29T11:06:13.973532785Z" level=info msg="TearDown network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" successfully" Jan 29 11:06:13.973577 containerd[1478]: time="2025-01-29T11:06:13.973542985Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" returns successfully" Jan 29 11:06:13.974081 containerd[1478]: time="2025-01-29T11:06:13.973979625Z" level=info msg="Ensure that sandbox 6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414 in task-service has been cleanup successfully" Jan 29 11:06:13.974647 containerd[1478]: time="2025-01-29T11:06:13.974550666Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:13.975544 kubelet[2815]: I0129 11:06:13.975366 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4" Jan 29 11:06:13.975925 containerd[1478]: time="2025-01-29T11:06:13.975157987Z" level=info msg="TearDown network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" successfully" Jan 29 11:06:13.976097 containerd[1478]: time="2025-01-29T11:06:13.975995707Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" returns successfully" Jan 29 11:06:13.976754 containerd[1478]: time="2025-01-29T11:06:13.975987987Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:13.976754 containerd[1478]: time="2025-01-29T11:06:13.976334148Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:13.976754 containerd[1478]: time="2025-01-29T11:06:13.976618348Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" Jan 29 11:06:13.977259 containerd[1478]: time="2025-01-29T11:06:13.976904548Z" level=info msg="Ensure that sandbox ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4 in task-service has been cleanup successfully" Jan 29 11:06:13.977259 containerd[1478]: time="2025-01-29T11:06:13.977077668Z" level=info msg="TearDown network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" successfully" Jan 29 11:06:13.977259 containerd[1478]: time="2025-01-29T11:06:13.977090388Z" 
level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" returns successfully" Jan 29 11:06:13.977897 containerd[1478]: time="2025-01-29T11:06:13.977863869Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:13.978080 containerd[1478]: time="2025-01-29T11:06:13.978045429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:3,}" Jan 29 11:06:13.978658 containerd[1478]: time="2025-01-29T11:06:13.978622630Z" level=info msg="TearDown network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" successfully" Jan 29 11:06:13.979002 containerd[1478]: time="2025-01-29T11:06:13.978954950Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" returns successfully" Jan 29 11:06:13.979677 containerd[1478]: time="2025-01-29T11:06:13.977971109Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 11:06:13.979677 containerd[1478]: time="2025-01-29T11:06:13.979497551Z" level=info msg="TearDown network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" successfully" Jan 29 11:06:13.979677 containerd[1478]: time="2025-01-29T11:06:13.979510431Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" returns successfully" Jan 29 11:06:13.980237 containerd[1478]: time="2025-01-29T11:06:13.980022311Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:13.980237 containerd[1478]: time="2025-01-29T11:06:13.980126911Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 11:06:13.980237 containerd[1478]: time="2025-01-29T11:06:13.980137671Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:13.980237 containerd[1478]: time="2025-01-29T11:06:13.980232231Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:13.980574 containerd[1478]: time="2025-01-29T11:06:13.980293631Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:13.980574 containerd[1478]: time="2025-01-29T11:06:13.980303631Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:13.981477 containerd[1478]: time="2025-01-29T11:06:13.981446953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:3,}" Jan 29 11:06:13.983049 containerd[1478]: time="2025-01-29T11:06:13.982808234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:3,}" Jan 29 11:06:14.189911 containerd[1478]: time="2025-01-29T11:06:14.189756773Z" level=error msg="Failed to destroy network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.198888 containerd[1478]: time="2025-01-29T11:06:14.198750500Z" level=error msg="encountered an error cleaning up failed sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.199018 containerd[1478]: time="2025-01-29T11:06:14.198916780Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.199517 kubelet[2815]: E0129 11:06:14.199314 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.199517 kubelet[2815]: E0129 11:06:14.199464 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:14.200191 kubelet[2815]: E0129 11:06:14.199489 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:14.200191 kubelet[2815]: E0129 11:06:14.199884 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" podUID="4dcc55b2-baa9-4b75-9e62-2012ad104fe8" Jan 29 11:06:14.244802 containerd[1478]: time="2025-01-29T11:06:14.244212819Z" level=error msg="Failed to destroy network for sandbox 
\"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.244802 containerd[1478]: time="2025-01-29T11:06:14.244630860Z" level=error msg="encountered an error cleaning up failed sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.244802 containerd[1478]: time="2025-01-29T11:06:14.244735060Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.245045 kubelet[2815]: E0129 11:06:14.244965 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.245045 kubelet[2815]: E0129 11:06:14.245018 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:14.245108 kubelet[2815]: E0129 11:06:14.245043 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:14.245108 kubelet[2815]: E0129 11:06:14.245092 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" podUID="7a04fac7-1f8b-48e6-9fb1-4421bdb042d6" 
Jan 29 11:06:14.246290 containerd[1478]: time="2025-01-29T11:06:14.245837581Z" level=error msg="Failed to destroy network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.246290 containerd[1478]: time="2025-01-29T11:06:14.246146181Z" level=error msg="encountered an error cleaning up failed sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.246290 containerd[1478]: time="2025-01-29T11:06:14.246201141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.246756 kubelet[2815]: E0129 11:06:14.246532 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.246756 kubelet[2815]: E0129 11:06:14.246621 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:14.246756 kubelet[2815]: E0129 11:06:14.246655 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:14.246904 kubelet[2815]: E0129 11:06:14.246716 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-7db6d8ff4d-mff4v" podUID="f133f1ed-4ff6-4186-84a9-0e6e2dda3b55" Jan 29 11:06:14.260969 containerd[1478]: time="2025-01-29T11:06:14.260794873Z" level=error msg="Failed to destroy network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.262905 containerd[1478]: time="2025-01-29T11:06:14.262747795Z" level=error msg="encountered an error cleaning up failed sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.263105 containerd[1478]: time="2025-01-29T11:06:14.262969515Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.263626 containerd[1478]: time="2025-01-29T11:06:14.263247316Z" level=error msg="Failed to destroy network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.263626 containerd[1478]: time="2025-01-29T11:06:14.263517956Z" level=error msg="encountered an error cleaning up failed sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.263729 kubelet[2815]: E0129 11:06:14.263225 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.263729 kubelet[2815]: E0129 11:06:14.263281 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:14.263729 kubelet[2815]: E0129 11:06:14.263345 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:14.264042 kubelet[2815]: E0129 11:06:14.263393 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-dcjg8" podUID="7be7e7ab-b82d-4802-9eba-8c9eb76668a3" Jan 29 11:06:14.264226 containerd[1478]: time="2025-01-29T11:06:14.263934076Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.264863 kubelet[2815]: E0129 11:06:14.264433 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.264863 kubelet[2815]: E0129 11:06:14.264470 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:14.264863 kubelet[2815]: E0129 11:06:14.264489 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:14.265068 kubelet[2815]: E0129 11:06:14.264519 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:14.265669 containerd[1478]: time="2025-01-29T11:06:14.265628638Z" level=error msg="Failed to destroy network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.266187 containerd[1478]: time="2025-01-29T11:06:14.266144638Z" level=error msg="encountered an error cleaning up failed sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.266537 containerd[1478]: time="2025-01-29T11:06:14.266473158Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.266750 kubelet[2815]: E0129 11:06:14.266651 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:14.266750 kubelet[2815]: E0129 11:06:14.266708 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:14.266750 kubelet[2815]: E0129 11:06:14.266726 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:14.266952 kubelet[2815]: E0129 11:06:14.266756 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" podUID="03412989-750a-48ed-b795-b7c29a91242d" Jan 29 11:06:14.579661 systemd[1]: run-netns-cni\x2dff4ee886\x2d4711\x2d22af\x2d7e24\x2d504c0eaf807a.mount: Deactivated successfully. Jan 29 11:06:14.579805 systemd[1]: run-netns-cni\x2dd86ef583\x2db32a\x2d5cd8\x2dc419\x2d187c1a2ece5b.mount: Deactivated successfully. Jan 29 11:06:14.985977 kubelet[2815]: I0129 11:06:14.985704 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01" Jan 29 11:06:14.987831 containerd[1478]: time="2025-01-29T11:06:14.987710415Z" level=info msg="StopPodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\"" Jan 29 11:06:14.990462 containerd[1478]: time="2025-01-29T11:06:14.987888615Z" level=info msg="Ensure that sandbox 1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01 in task-service has been cleanup successfully" Jan 29 11:06:14.991734 systemd[1]: run-netns-cni\x2dd5fbed77\x2dacbd\x2d4aea\x2d340a\x2dc1890b1c1c30.mount: Deactivated successfully. Jan 29 11:06:14.992631 containerd[1478]: time="2025-01-29T11:06:14.991917259Z" level=info msg="TearDown network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" successfully" Jan 29 11:06:14.993574 containerd[1478]: time="2025-01-29T11:06:14.992305539Z" level=info msg="StopPodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" returns successfully" Jan 29 11:06:14.995241 containerd[1478]: time="2025-01-29T11:06:14.995049662Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" Jan 29 11:06:14.995936 containerd[1478]: time="2025-01-29T11:06:14.995906462Z" level=info msg="TearDown network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" successfully" Jan 29 11:06:14.995936 containerd[1478]: time="2025-01-29T11:06:14.995930342Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" returns successfully" Jan 29 11:06:14.997270 containerd[1478]: time="2025-01-29T11:06:14.996891783Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:14.997270 containerd[1478]: time="2025-01-29T11:06:14.996983623Z" level=info msg="TearDown network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" successfully" Jan 29 11:06:14.997270 containerd[1478]: time="2025-01-29T11:06:14.996994983Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" returns successfully" Jan 29 11:06:14.997412 containerd[1478]: time="2025-01-29T11:06:14.997382424Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:14.997483 containerd[1478]: time="2025-01-29T11:06:14.997467304Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:14.997523 containerd[1478]: time="2025-01-29T11:06:14.997481584Z" level=info msg="StopPodSandbox for 
\"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 11:06:14.997592 kubelet[2815]: I0129 11:06:14.997563 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8" Jan 29 11:06:14.998880 containerd[1478]: time="2025-01-29T11:06:14.998510625Z" level=info msg="StopPodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\"" Jan 29 11:06:14.999252 containerd[1478]: time="2025-01-29T11:06:14.999215985Z" level=info msg="Ensure that sandbox fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8 in task-service has been cleanup successfully" Jan 29 11:06:15.001123 containerd[1478]: time="2025-01-29T11:06:14.998628585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:4,}" Jan 29 11:06:15.002873 containerd[1478]: time="2025-01-29T11:06:15.001968627Z" level=info msg="TearDown network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" successfully" Jan 29 11:06:15.002873 containerd[1478]: time="2025-01-29T11:06:15.002016307Z" level=info msg="StopPodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" returns successfully" Jan 29 11:06:15.003656 systemd[1]: run-netns-cni\x2dcb81ec21\x2d60f7\x2d13b2\x2d55d4\x2d1df83cc6c39d.mount: Deactivated successfully. Jan 29 11:06:15.005841 containerd[1478]: time="2025-01-29T11:06:15.005787350Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" Jan 29 11:06:15.007358 containerd[1478]: time="2025-01-29T11:06:15.007332631Z" level=info msg="TearDown network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" successfully" Jan 29 11:06:15.007607 kubelet[2815]: I0129 11:06:15.007493 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768" Jan 29 11:06:15.008913 containerd[1478]: time="2025-01-29T11:06:15.007673592Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" returns successfully" Jan 29 11:06:15.009773 containerd[1478]: time="2025-01-29T11:06:15.009747353Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:15.011021 containerd[1478]: time="2025-01-29T11:06:15.010922234Z" level=info msg="TearDown network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" successfully" Jan 29 11:06:15.011021 containerd[1478]: time="2025-01-29T11:06:15.010953394Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" returns successfully" Jan 29 11:06:15.011225 containerd[1478]: time="2025-01-29T11:06:15.011014834Z" level=info msg="StopPodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\"" Jan 29 11:06:15.011225 containerd[1478]: time="2025-01-29T11:06:15.011183954Z" level=info msg="Ensure that sandbox 8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768 in task-service has been cleanup successfully" Jan 29 11:06:15.013611 containerd[1478]: time="2025-01-29T11:06:15.012699435Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:15.013611 
containerd[1478]: time="2025-01-29T11:06:15.012790476Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:15.013611 containerd[1478]: time="2025-01-29T11:06:15.012800196Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:15.014510 systemd[1]: run-netns-cni\x2d8f9e7a9e\x2d302d\x2d1c50\x2dfdf5\x2d1485108356b2.mount: Deactivated successfully. Jan 29 11:06:15.016595 containerd[1478]: time="2025-01-29T11:06:15.016559838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:4,}" Jan 29 11:06:15.017456 containerd[1478]: time="2025-01-29T11:06:15.017426919Z" level=info msg="TearDown network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" successfully" Jan 29 11:06:15.017456 containerd[1478]: time="2025-01-29T11:06:15.017455399Z" level=info msg="StopPodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" returns successfully" Jan 29 11:06:15.018227 containerd[1478]: time="2025-01-29T11:06:15.018127320Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" Jan 29 11:06:15.018562 containerd[1478]: time="2025-01-29T11:06:15.018339880Z" level=info msg="TearDown network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" successfully" Jan 29 11:06:15.018562 containerd[1478]: time="2025-01-29T11:06:15.018356200Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" returns successfully" Jan 29 11:06:15.020312 containerd[1478]: time="2025-01-29T11:06:15.020274681Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:15.020374 containerd[1478]: time="2025-01-29T11:06:15.020361961Z" level=info msg="TearDown network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" successfully" Jan 29 11:06:15.020420 containerd[1478]: time="2025-01-29T11:06:15.020371241Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" returns successfully" Jan 29 11:06:15.021310 kubelet[2815]: I0129 11:06:15.020510 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63" Jan 29 11:06:15.021677 containerd[1478]: time="2025-01-29T11:06:15.021529682Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:15.021677 containerd[1478]: time="2025-01-29T11:06:15.021617282Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:15.021677 containerd[1478]: time="2025-01-29T11:06:15.021627082Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:15.021677 containerd[1478]: time="2025-01-29T11:06:15.021632602Z" level=info msg="StopPodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\"" Jan 29 11:06:15.021842 containerd[1478]: time="2025-01-29T11:06:15.021798442Z" level=info msg="Ensure that sandbox 88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63 
in task-service has been cleanup successfully" Jan 29 11:06:15.024977 containerd[1478]: time="2025-01-29T11:06:15.024942525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:4,}" Jan 29 11:06:15.025779 systemd[1]: run-netns-cni\x2d9f74a98b\x2d2b46\x2d9864\x2d67af\x2d8258602145ab.mount: Deactivated successfully. Jan 29 11:06:15.028046 containerd[1478]: time="2025-01-29T11:06:15.026562726Z" level=info msg="TearDown network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" successfully" Jan 29 11:06:15.028046 containerd[1478]: time="2025-01-29T11:06:15.027386327Z" level=info msg="StopPodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" returns successfully" Jan 29 11:06:15.028226 containerd[1478]: time="2025-01-29T11:06:15.028194487Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" Jan 29 11:06:15.028342 containerd[1478]: time="2025-01-29T11:06:15.028284047Z" level=info msg="TearDown network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" successfully" Jan 29 11:06:15.028342 containerd[1478]: time="2025-01-29T11:06:15.028297407Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" returns successfully" Jan 29 11:06:15.028935 containerd[1478]: time="2025-01-29T11:06:15.028796808Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:15.029209 containerd[1478]: time="2025-01-29T11:06:15.029126768Z" level=info msg="TearDown network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" successfully" Jan 29 11:06:15.029209 containerd[1478]: time="2025-01-29T11:06:15.029144768Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" returns successfully" Jan 29 11:06:15.031336 containerd[1478]: time="2025-01-29T11:06:15.030759049Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:15.031336 containerd[1478]: time="2025-01-29T11:06:15.030918249Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 11:06:15.031336 containerd[1478]: time="2025-01-29T11:06:15.030931049Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:15.032110 containerd[1478]: time="2025-01-29T11:06:15.031594890Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:4,}" Jan 29 11:06:15.032477 kubelet[2815]: I0129 11:06:15.032453 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7" Jan 29 11:06:15.034891 containerd[1478]: time="2025-01-29T11:06:15.034808332Z" level=info msg="StopPodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\"" Jan 29 11:06:15.035047 containerd[1478]: time="2025-01-29T11:06:15.035012452Z" level=info msg="Ensure that sandbox a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7 in task-service has been cleanup successfully" Jan 29 11:06:15.036807 containerd[1478]: 
time="2025-01-29T11:06:15.036611854Z" level=info msg="TearDown network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" successfully" Jan 29 11:06:15.036807 containerd[1478]: time="2025-01-29T11:06:15.036642814Z" level=info msg="StopPodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" returns successfully" Jan 29 11:06:15.039467 containerd[1478]: time="2025-01-29T11:06:15.039142696Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" Jan 29 11:06:15.039467 containerd[1478]: time="2025-01-29T11:06:15.039272136Z" level=info msg="TearDown network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" successfully" Jan 29 11:06:15.039467 containerd[1478]: time="2025-01-29T11:06:15.039283256Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" returns successfully" Jan 29 11:06:15.040366 containerd[1478]: time="2025-01-29T11:06:15.040335616Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 11:06:15.040552 containerd[1478]: time="2025-01-29T11:06:15.040418097Z" level=info msg="TearDown network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" successfully" Jan 29 11:06:15.040552 containerd[1478]: time="2025-01-29T11:06:15.040468537Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" returns successfully" Jan 29 11:06:15.041598 containerd[1478]: time="2025-01-29T11:06:15.041521017Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:15.041968 containerd[1478]: time="2025-01-29T11:06:15.041917978Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:15.041968 containerd[1478]: time="2025-01-29T11:06:15.041945458Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:15.043301 containerd[1478]: time="2025-01-29T11:06:15.042986698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:4,}" Jan 29 11:06:15.043592 kubelet[2815]: I0129 11:06:15.043533 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b" Jan 29 11:06:15.044238 containerd[1478]: time="2025-01-29T11:06:15.044211859Z" level=info msg="StopPodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\"" Jan 29 11:06:15.044382 containerd[1478]: time="2025-01-29T11:06:15.044361500Z" level=info msg="Ensure that sandbox 1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b in task-service has been cleanup successfully" Jan 29 11:06:15.046416 containerd[1478]: time="2025-01-29T11:06:15.046181021Z" level=info msg="TearDown network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" successfully" Jan 29 11:06:15.046416 containerd[1478]: time="2025-01-29T11:06:15.046213861Z" level=info msg="StopPodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" returns successfully" Jan 29 11:06:15.047583 containerd[1478]: time="2025-01-29T11:06:15.047257022Z" level=info 
msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" Jan 29 11:06:15.047583 containerd[1478]: time="2025-01-29T11:06:15.047409542Z" level=info msg="TearDown network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" successfully" Jan 29 11:06:15.047583 containerd[1478]: time="2025-01-29T11:06:15.047429662Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" returns successfully" Jan 29 11:06:15.049450 containerd[1478]: time="2025-01-29T11:06:15.049102223Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:15.049949 containerd[1478]: time="2025-01-29T11:06:15.049760504Z" level=info msg="TearDown network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" successfully" Jan 29 11:06:15.052761 containerd[1478]: time="2025-01-29T11:06:15.052632906Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" returns successfully" Jan 29 11:06:15.054267 containerd[1478]: time="2025-01-29T11:06:15.054101627Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:15.054365 containerd[1478]: time="2025-01-29T11:06:15.054331067Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:15.054365 containerd[1478]: time="2025-01-29T11:06:15.054350347Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:15.056488 containerd[1478]: time="2025-01-29T11:06:15.056265109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:4,}" Jan 29 11:06:15.232510 containerd[1478]: time="2025-01-29T11:06:15.232467362Z" level=error msg="Failed to destroy network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.233535 containerd[1478]: time="2025-01-29T11:06:15.233497603Z" level=error msg="encountered an error cleaning up failed sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.235574 containerd[1478]: time="2025-01-29T11:06:15.235117884Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.236454 kubelet[2815]: E0129 11:06:15.235947 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.236454 kubelet[2815]: E0129 11:06:15.236011 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:15.236454 kubelet[2815]: E0129 11:06:15.236034 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" Jan 29 11:06:15.236800 kubelet[2815]: E0129 11:06:15.236089 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d6547ffc8-zlmj5_calico-system(4dcc55b2-baa9-4b75-9e62-2012ad104fe8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" podUID="4dcc55b2-baa9-4b75-9e62-2012ad104fe8" Jan 29 11:06:15.254487 containerd[1478]: time="2025-01-29T11:06:15.254110379Z" level=error msg="Failed to destroy network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.255405 containerd[1478]: time="2025-01-29T11:06:15.255373700Z" level=error msg="encountered an error cleaning up failed sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.256076 containerd[1478]: time="2025-01-29T11:06:15.255636300Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.257157 kubelet[2815]: E0129 
11:06:15.256799 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.257157 kubelet[2815]: E0129 11:06:15.256865 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:15.257157 kubelet[2815]: E0129 11:06:15.256886 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" Jan 29 11:06:15.257310 kubelet[2815]: E0129 11:06:15.256920 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-zvlk7_calico-apiserver(7a04fac7-1f8b-48e6-9fb1-4421bdb042d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" podUID="7a04fac7-1f8b-48e6-9fb1-4421bdb042d6" Jan 29 11:06:15.295680 containerd[1478]: time="2025-01-29T11:06:15.295510050Z" level=error msg="Failed to destroy network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.296618 containerd[1478]: time="2025-01-29T11:06:15.295892610Z" level=error msg="encountered an error cleaning up failed sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.296618 containerd[1478]: time="2025-01-29T11:06:15.295959170Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.296761 kubelet[2815]: E0129 11:06:15.296155 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.296761 kubelet[2815]: E0129 11:06:15.296236 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:15.296761 kubelet[2815]: E0129 11:06:15.296270 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-dcjg8" Jan 29 11:06:15.296863 kubelet[2815]: E0129 11:06:15.296304 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-dcjg8_kube-system(7be7e7ab-b82d-4802-9eba-8c9eb76668a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-dcjg8" podUID="7be7e7ab-b82d-4802-9eba-8c9eb76668a3" Jan 29 11:06:15.298833 containerd[1478]: time="2025-01-29T11:06:15.298793933Z" level=error msg="Failed to destroy network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.300208 containerd[1478]: time="2025-01-29T11:06:15.299973973Z" level=error msg="encountered an error cleaning up failed sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.300208 containerd[1478]: time="2025-01-29T11:06:15.300043414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.301361 kubelet[2815]: E0129 11:06:15.301066 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.301361 kubelet[2815]: E0129 11:06:15.301130 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:15.301361 kubelet[2815]: E0129 11:06:15.301149 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mtvgj" Jan 29 11:06:15.301496 kubelet[2815]: E0129 11:06:15.301190 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mtvgj_calico-system(66d59454-c196-4ace-a57f-96550c417a39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mtvgj" podUID="66d59454-c196-4ace-a57f-96550c417a39" Jan 29 11:06:15.309428 containerd[1478]: time="2025-01-29T11:06:15.309310781Z" level=error msg="Failed to destroy network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.309876 containerd[1478]: time="2025-01-29T11:06:15.309755181Z" level=error msg="encountered an error cleaning up failed sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.309876 containerd[1478]: time="2025-01-29T11:06:15.309833741Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox 
\"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.310743 kubelet[2815]: E0129 11:06:15.310317 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.310743 kubelet[2815]: E0129 11:06:15.310365 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:15.310743 kubelet[2815]: E0129 11:06:15.310389 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-mff4v" Jan 29 11:06:15.310917 kubelet[2815]: E0129 11:06:15.310426 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-mff4v_kube-system(f133f1ed-4ff6-4186-84a9-0e6e2dda3b55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-mff4v" podUID="f133f1ed-4ff6-4186-84a9-0e6e2dda3b55" Jan 29 11:06:15.327230 containerd[1478]: time="2025-01-29T11:06:15.326934834Z" level=error msg="Failed to destroy network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.328259 containerd[1478]: time="2025-01-29T11:06:15.328142635Z" level=error msg="encountered an error cleaning up failed sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.328259 containerd[1478]: time="2025-01-29T11:06:15.328213395Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.328807 kubelet[2815]: E0129 11:06:15.328633 2815 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 11:06:15.328807 kubelet[2815]: E0129 11:06:15.328719 2815 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:15.328807 kubelet[2815]: E0129 11:06:15.328742 2815 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" Jan 29 11:06:15.329166 kubelet[2815]: E0129 11:06:15.328803 2815 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f785fb85f-vp8hz_calico-apiserver(03412989-750a-48ed-b795-b7c29a91242d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" podUID="03412989-750a-48ed-b795-b7c29a91242d" Jan 29 11:06:15.578880 systemd[1]: run-netns-cni\x2df4c37279\x2dc250\x2d6385\x2d613e\x2da40c23ec7e79.mount: Deactivated successfully. Jan 29 11:06:15.578972 systemd[1]: run-netns-cni\x2d21924d34\x2d37a2\x2dcca8\x2dca57\x2d56f06af83e22.mount: Deactivated successfully. Jan 29 11:06:15.711748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3169883171.mount: Deactivated successfully. 
Jan 29 11:06:15.744747 containerd[1478]: time="2025-01-29T11:06:15.742892150Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:15.744747 containerd[1478]: time="2025-01-29T11:06:15.744232311Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=137671762" Jan 29 11:06:15.744747 containerd[1478]: time="2025-01-29T11:06:15.744589831Z" level=info msg="ImageCreate event name:\"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:15.747038 containerd[1478]: time="2025-01-29T11:06:15.746981313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:15.748189 containerd[1478]: time="2025-01-29T11:06:15.748154714Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"137671624\" in 4.901066876s" Jan 29 11:06:15.748313 containerd[1478]: time="2025-01-29T11:06:15.748289554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\"" Jan 29 11:06:15.763841 containerd[1478]: time="2025-01-29T11:06:15.763785725Z" level=info msg="CreateContainer within sandbox \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 29 11:06:15.787807 containerd[1478]: time="2025-01-29T11:06:15.787756184Z" level=info msg="CreateContainer within sandbox \"1521a341295dfadba091c81b4f102416afb3f52efd2387af64e974cd2fe6b9d5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154\"" Jan 29 11:06:15.788850 containerd[1478]: time="2025-01-29T11:06:15.788564304Z" level=info msg="StartContainer for \"9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154\"" Jan 29 11:06:15.818066 systemd[1]: Started cri-containerd-9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154.scope - libcontainer container 9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154. Jan 29 11:06:15.855919 containerd[1478]: time="2025-01-29T11:06:15.855210635Z" level=info msg="StartContainer for \"9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154\" returns successfully" Jan 29 11:06:15.962169 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 29 11:06:15.962282 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 29 11:06:16.050023 kubelet[2815]: I0129 11:06:16.049905 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313" Jan 29 11:06:16.051864 containerd[1478]: time="2025-01-29T11:06:16.051300379Z" level=info msg="StopPodSandbox for \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\"" Jan 29 11:06:16.051864 containerd[1478]: time="2025-01-29T11:06:16.051698179Z" level=info msg="Ensure that sandbox 67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313 in task-service has been cleanup successfully" Jan 29 11:06:16.052204 containerd[1478]: time="2025-01-29T11:06:16.051909779Z" level=info msg="TearDown network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\" successfully" Jan 29 11:06:16.052204 containerd[1478]: time="2025-01-29T11:06:16.051953979Z" level=info msg="StopPodSandbox for \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\" returns successfully" Jan 29 11:06:16.052764 containerd[1478]: time="2025-01-29T11:06:16.052725460Z" level=info msg="StopPodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\"" Jan 29 11:06:16.053192 containerd[1478]: time="2025-01-29T11:06:16.052805420Z" level=info msg="TearDown network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" successfully" Jan 29 11:06:16.053192 containerd[1478]: time="2025-01-29T11:06:16.052827220Z" level=info msg="StopPodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" returns successfully" Jan 29 11:06:16.053192 containerd[1478]: time="2025-01-29T11:06:16.053143340Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" Jan 29 11:06:16.053905 containerd[1478]: time="2025-01-29T11:06:16.053238940Z" level=info msg="TearDown network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" successfully" Jan 29 11:06:16.053905 containerd[1478]: time="2025-01-29T11:06:16.053250380Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" returns successfully" Jan 29 11:06:16.053905 containerd[1478]: time="2025-01-29T11:06:16.053841341Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:16.054020 containerd[1478]: time="2025-01-29T11:06:16.053933141Z" level=info msg="TearDown network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" successfully" Jan 29 11:06:16.054052 containerd[1478]: time="2025-01-29T11:06:16.053943261Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" returns successfully" Jan 29 11:06:16.055079 containerd[1478]: time="2025-01-29T11:06:16.054794941Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:16.055859 containerd[1478]: time="2025-01-29T11:06:16.055189662Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 11:06:16.055859 containerd[1478]: time="2025-01-29T11:06:16.055209702Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:16.055859 containerd[1478]: time="2025-01-29T11:06:16.055651542Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:5,}" Jan 29 11:06:16.056438 kubelet[2815]: I0129 11:06:16.056328 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a" Jan 29 11:06:16.058528 containerd[1478]: time="2025-01-29T11:06:16.058473784Z" level=info msg="StopPodSandbox for \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\"" Jan 29 11:06:16.058647 containerd[1478]: time="2025-01-29T11:06:16.058625624Z" level=info msg="Ensure that sandbox 64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a in task-service has been cleanup successfully" Jan 29 11:06:16.061229 containerd[1478]: time="2025-01-29T11:06:16.060533745Z" level=info msg="TearDown network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\" successfully" Jan 29 11:06:16.061229 containerd[1478]: time="2025-01-29T11:06:16.060556145Z" level=info msg="StopPodSandbox for \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\" returns successfully" Jan 29 11:06:16.062736 containerd[1478]: time="2025-01-29T11:06:16.062541906Z" level=info msg="StopPodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\"" Jan 29 11:06:16.063791 containerd[1478]: time="2025-01-29T11:06:16.063538027Z" level=info msg="TearDown network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" successfully" Jan 29 11:06:16.063791 containerd[1478]: time="2025-01-29T11:06:16.063560227Z" level=info msg="StopPodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" returns successfully" Jan 29 11:06:16.066179 containerd[1478]: time="2025-01-29T11:06:16.065258228Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" Jan 29 11:06:16.066179 containerd[1478]: time="2025-01-29T11:06:16.065349268Z" level=info msg="TearDown network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" successfully" Jan 29 11:06:16.066179 containerd[1478]: time="2025-01-29T11:06:16.065358748Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" returns successfully" Jan 29 11:06:16.066759 containerd[1478]: time="2025-01-29T11:06:16.066427189Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:16.066759 containerd[1478]: time="2025-01-29T11:06:16.066541549Z" level=info msg="TearDown network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" successfully" Jan 29 11:06:16.066759 containerd[1478]: time="2025-01-29T11:06:16.066552189Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" returns successfully" Jan 29 11:06:16.068841 containerd[1478]: time="2025-01-29T11:06:16.068105750Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:16.068841 containerd[1478]: time="2025-01-29T11:06:16.068200150Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:16.068841 containerd[1478]: time="2025-01-29T11:06:16.068210150Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 
11:06:16.069444 containerd[1478]: time="2025-01-29T11:06:16.069216471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:5,}" Jan 29 11:06:16.071269 kubelet[2815]: I0129 11:06:16.070701 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b" Jan 29 11:06:16.073250 containerd[1478]: time="2025-01-29T11:06:16.072963113Z" level=info msg="StopPodSandbox for \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\"" Jan 29 11:06:16.074911 containerd[1478]: time="2025-01-29T11:06:16.074501234Z" level=info msg="Ensure that sandbox 76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b in task-service has been cleanup successfully" Jan 29 11:06:16.076455 containerd[1478]: time="2025-01-29T11:06:16.076425196Z" level=info msg="TearDown network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\" successfully" Jan 29 11:06:16.078006 containerd[1478]: time="2025-01-29T11:06:16.077980037Z" level=info msg="StopPodSandbox for \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\" returns successfully" Jan 29 11:06:16.079439 containerd[1478]: time="2025-01-29T11:06:16.079277438Z" level=info msg="StopPodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\"" Jan 29 11:06:16.080747 kubelet[2815]: I0129 11:06:16.079781 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af" Jan 29 11:06:16.081029 containerd[1478]: time="2025-01-29T11:06:16.080933599Z" level=info msg="TearDown network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" successfully" Jan 29 11:06:16.081029 containerd[1478]: time="2025-01-29T11:06:16.080957999Z" level=info msg="StopPodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" returns successfully" Jan 29 11:06:16.081978 containerd[1478]: time="2025-01-29T11:06:16.081550799Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" Jan 29 11:06:16.081978 containerd[1478]: time="2025-01-29T11:06:16.081663999Z" level=info msg="TearDown network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" successfully" Jan 29 11:06:16.081978 containerd[1478]: time="2025-01-29T11:06:16.081674919Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" returns successfully" Jan 29 11:06:16.083603 containerd[1478]: time="2025-01-29T11:06:16.082982880Z" level=info msg="StopPodSandbox for \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\"" Jan 29 11:06:16.083603 containerd[1478]: time="2025-01-29T11:06:16.083107720Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:16.083603 containerd[1478]: time="2025-01-29T11:06:16.083180360Z" level=info msg="TearDown network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" successfully" Jan 29 11:06:16.083603 containerd[1478]: time="2025-01-29T11:06:16.083189360Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" returns successfully" Jan 29 11:06:16.083603 containerd[1478]: time="2025-01-29T11:06:16.083192600Z" 
level=info msg="Ensure that sandbox 384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af in task-service has been cleanup successfully" Jan 29 11:06:16.084079 containerd[1478]: time="2025-01-29T11:06:16.083891361Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:16.085053 containerd[1478]: time="2025-01-29T11:06:16.084199841Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:16.085053 containerd[1478]: time="2025-01-29T11:06:16.084222601Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:16.087969 containerd[1478]: time="2025-01-29T11:06:16.087663123Z" level=info msg="TearDown network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\" successfully" Jan 29 11:06:16.088258 containerd[1478]: time="2025-01-29T11:06:16.088221604Z" level=info msg="StopPodSandbox for \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\" returns successfully" Jan 29 11:06:16.088309 containerd[1478]: time="2025-01-29T11:06:16.088128243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:5,}" Jan 29 11:06:16.089367 kubelet[2815]: I0129 11:06:16.089058 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273" Jan 29 11:06:16.093670 containerd[1478]: time="2025-01-29T11:06:16.093633647Z" level=info msg="StopPodSandbox for \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\"" Jan 29 11:06:16.094564 containerd[1478]: time="2025-01-29T11:06:16.094528008Z" level=info msg="StopPodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\"" Jan 29 11:06:16.094677 containerd[1478]: time="2025-01-29T11:06:16.094624608Z" level=info msg="TearDown network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" successfully" Jan 29 11:06:16.094677 containerd[1478]: time="2025-01-29T11:06:16.094635088Z" level=info msg="StopPodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" returns successfully" Jan 29 11:06:16.096434 containerd[1478]: time="2025-01-29T11:06:16.096181249Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" Jan 29 11:06:16.096434 containerd[1478]: time="2025-01-29T11:06:16.096271809Z" level=info msg="TearDown network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" successfully" Jan 29 11:06:16.096434 containerd[1478]: time="2025-01-29T11:06:16.096280809Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" returns successfully" Jan 29 11:06:16.097206 containerd[1478]: time="2025-01-29T11:06:16.097165769Z" level=info msg="Ensure that sandbox 20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273 in task-service has been cleanup successfully" Jan 29 11:06:16.099650 kubelet[2815]: I0129 11:06:16.099576 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ztccw" podStartSLOduration=1.6890102809999998 podStartE2EDuration="15.099557731s" podCreationTimestamp="2025-01-29 11:06:01 +0000 UTC" 
firstStartedPulling="2025-01-29 11:06:02.338615344 +0000 UTC m=+21.786971410" lastFinishedPulling="2025-01-29 11:06:15.749162754 +0000 UTC m=+35.197518860" observedRunningTime="2025-01-29 11:06:16.095722208 +0000 UTC m=+35.544078314" watchObservedRunningTime="2025-01-29 11:06:16.099557731 +0000 UTC m=+35.547913837" Jan 29 11:06:16.104270 containerd[1478]: time="2025-01-29T11:06:16.100412692Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:16.104987 containerd[1478]: time="2025-01-29T11:06:16.104338654Z" level=info msg="TearDown network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" successfully" Jan 29 11:06:16.104987 containerd[1478]: time="2025-01-29T11:06:16.104574574Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" returns successfully" Jan 29 11:06:16.104987 containerd[1478]: time="2025-01-29T11:06:16.103411174Z" level=info msg="TearDown network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\" successfully" Jan 29 11:06:16.104987 containerd[1478]: time="2025-01-29T11:06:16.104629014Z" level=info msg="StopPodSandbox for \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\" returns successfully" Jan 29 11:06:16.108355 containerd[1478]: time="2025-01-29T11:06:16.108261977Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:16.108866 containerd[1478]: time="2025-01-29T11:06:16.108714297Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:16.108866 containerd[1478]: time="2025-01-29T11:06:16.108790217Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:16.109518 containerd[1478]: time="2025-01-29T11:06:16.109056817Z" level=info msg="StopPodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\"" Jan 29 11:06:16.109518 containerd[1478]: time="2025-01-29T11:06:16.109135937Z" level=info msg="TearDown network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" successfully" Jan 29 11:06:16.109518 containerd[1478]: time="2025-01-29T11:06:16.109145017Z" level=info msg="StopPodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" returns successfully" Jan 29 11:06:16.112425 containerd[1478]: time="2025-01-29T11:06:16.112162059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:5,}" Jan 29 11:06:16.112425 containerd[1478]: time="2025-01-29T11:06:16.112223659Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" Jan 29 11:06:16.112425 containerd[1478]: time="2025-01-29T11:06:16.112299780Z" level=info msg="TearDown network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" successfully" Jan 29 11:06:16.112425 containerd[1478]: time="2025-01-29T11:06:16.112309060Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" returns successfully" Jan 29 11:06:16.114055 containerd[1478]: time="2025-01-29T11:06:16.114022061Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 
11:06:16.114455 containerd[1478]: time="2025-01-29T11:06:16.114424061Z" level=info msg="TearDown network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" successfully" Jan 29 11:06:16.114632 containerd[1478]: time="2025-01-29T11:06:16.114514501Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" returns successfully" Jan 29 11:06:16.115720 containerd[1478]: time="2025-01-29T11:06:16.115517702Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:16.116067 containerd[1478]: time="2025-01-29T11:06:16.115965982Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:16.116917 containerd[1478]: time="2025-01-29T11:06:16.116836223Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:16.117840 containerd[1478]: time="2025-01-29T11:06:16.117629223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:5,}" Jan 29 11:06:16.119859 kubelet[2815]: I0129 11:06:16.119564 2815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3" Jan 29 11:06:16.121370 containerd[1478]: time="2025-01-29T11:06:16.121071105Z" level=info msg="StopPodSandbox for \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\"" Jan 29 11:06:16.121435 containerd[1478]: time="2025-01-29T11:06:16.121379986Z" level=info msg="Ensure that sandbox 58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3 in task-service has been cleanup successfully" Jan 29 11:06:16.122197 containerd[1478]: time="2025-01-29T11:06:16.121674586Z" level=info msg="TearDown network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\" successfully" Jan 29 11:06:16.122197 containerd[1478]: time="2025-01-29T11:06:16.121742026Z" level=info msg="StopPodSandbox for \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\" returns successfully" Jan 29 11:06:16.122431 containerd[1478]: time="2025-01-29T11:06:16.122305026Z" level=info msg="StopPodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\"" Jan 29 11:06:16.122633 containerd[1478]: time="2025-01-29T11:06:16.122601346Z" level=info msg="TearDown network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" successfully" Jan 29 11:06:16.122633 containerd[1478]: time="2025-01-29T11:06:16.122620226Z" level=info msg="StopPodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" returns successfully" Jan 29 11:06:16.123453 containerd[1478]: time="2025-01-29T11:06:16.123191107Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" Jan 29 11:06:16.123453 containerd[1478]: time="2025-01-29T11:06:16.123312987Z" level=info msg="TearDown network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" successfully" Jan 29 11:06:16.123453 containerd[1478]: time="2025-01-29T11:06:16.123339107Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" returns successfully" Jan 29 11:06:16.124212 containerd[1478]: 
time="2025-01-29T11:06:16.123746787Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:16.126092 containerd[1478]: time="2025-01-29T11:06:16.125982149Z" level=info msg="TearDown network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" successfully" Jan 29 11:06:16.126092 containerd[1478]: time="2025-01-29T11:06:16.126010589Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" returns successfully" Jan 29 11:06:16.128032 containerd[1478]: time="2025-01-29T11:06:16.127991470Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:16.129405 containerd[1478]: time="2025-01-29T11:06:16.129260511Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:16.129405 containerd[1478]: time="2025-01-29T11:06:16.129290151Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:16.130442 containerd[1478]: time="2025-01-29T11:06:16.130196511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:5,}" Jan 29 11:06:16.582749 systemd[1]: run-netns-cni\x2d51665cd5\x2d131f\x2dce9b\x2d5671\x2d929fa904a1b0.mount: Deactivated successfully. Jan 29 11:06:16.582851 systemd[1]: run-netns-cni\x2d9db82d08\x2da085\x2d832a\x2d5807\x2d9d84b4ee6409.mount: Deactivated successfully. Jan 29 11:06:16.582903 systemd[1]: run-netns-cni\x2dc6221201\x2d2bc9\x2d0195\x2d8502\x2de8bb856004ce.mount: Deactivated successfully. Jan 29 11:06:16.582946 systemd[1]: run-netns-cni\x2dc6b9628b\x2d4baa\x2d6e02\x2d06ed\x2d93834c42f7ca.mount: Deactivated successfully. Jan 29 11:06:16.582991 systemd[1]: run-netns-cni\x2db64428ae\x2d78d0\x2de1bc\x2db165\x2dea85b3bf73fb.mount: Deactivated successfully. Jan 29 11:06:16.583031 systemd[1]: run-netns-cni\x2d74d8e023\x2df3fa\x2d0050\x2d7787\x2d1157125a71b5.mount: Deactivated successfully. 
Jan 29 11:06:16.629321 systemd-networkd[1383]: cali3776b9ddb34: Link UP Jan 29 11:06:16.630365 systemd-networkd[1383]: cali3776b9ddb34: Gained carrier Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.187 [INFO][4606] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.257 [INFO][4606] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0 calico-kube-controllers-d6547ffc8- calico-system 4dcc55b2-baa9-4b75-9e62-2012ad104fe8 710 0 2025-01-29 11:06:02 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:d6547ffc8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4152-2-0-b-6e231d00a9 calico-kube-controllers-d6547ffc8-zlmj5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3776b9ddb34 [] []}} ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.257 [INFO][4606] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.449 [INFO][4668] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" HandleID="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.486 [INFO][4668] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" HandleID="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400057e820), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4152-2-0-b-6e231d00a9", "pod":"calico-kube-controllers-d6547ffc8-zlmj5", "timestamp":"2025-01-29 11:06:16.449824444 +0000 UTC"}, Hostname:"ci-4152-2-0-b-6e231d00a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.487 [INFO][4668] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.489 [INFO][4668] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.490 [INFO][4668] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152-2-0-b-6e231d00a9' Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.496 [INFO][4668] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.529 [INFO][4668] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.558 [INFO][4668] ipam/ipam.go 489: Trying affinity for 192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.567 [INFO][4668] ipam/ipam.go 155: Attempting to load block cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.577 [INFO][4668] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.577 [INFO][4668] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.591 [INFO][4668] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.602 [INFO][4668] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.610 [INFO][4668] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.86.193/26] block=192.168.86.192/26 handle="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.610 [INFO][4668] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.86.193/26] handle="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.610 [INFO][4668] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
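For reference, the block Calico loads above, 192.168.86.192/26, covers 192.168.86.192 through 192.168.86.255 (64 addresses); the first claim is .193, and the later sandboxes in this section receive .194 through .197 from the same block. A quick standard-library check (a sketch, not Calico code) that a claimed address belongs to the block:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.86.192/26")
	claimed := netip.MustParseAddr("192.168.86.193")
	fmt.Println(block.Contains(claimed)) // true: .193 is inside .192/26
}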
Jan 29 11:06:16.660400 containerd[1478]: 2025-01-29 11:06:16.610 [INFO][4668] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.86.193/26] IPv6=[] ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" HandleID="k8s-pod-network.1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.661420 containerd[1478]: 2025-01-29 11:06:16.614 [INFO][4606] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0", GenerateName:"calico-kube-controllers-d6547ffc8-", Namespace:"calico-system", SelfLink:"", UID:"4dcc55b2-baa9-4b75-9e62-2012ad104fe8", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d6547ffc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"", Pod:"calico-kube-controllers-d6547ffc8-zlmj5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.86.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3776b9ddb34", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.661420 containerd[1478]: 2025-01-29 11:06:16.614 [INFO][4606] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.86.193/32] ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.661420 containerd[1478]: 2025-01-29 11:06:16.614 [INFO][4606] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3776b9ddb34 ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.661420 containerd[1478]: 2025-01-29 11:06:16.630 [INFO][4606] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.661420 
containerd[1478]: 2025-01-29 11:06:16.632 [INFO][4606] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0", GenerateName:"calico-kube-controllers-d6547ffc8-", Namespace:"calico-system", SelfLink:"", UID:"4dcc55b2-baa9-4b75-9e62-2012ad104fe8", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d6547ffc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b", Pod:"calico-kube-controllers-d6547ffc8-zlmj5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.86.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3776b9ddb34", MAC:"4e:57:3a:fe:84:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.661420 containerd[1478]: 2025-01-29 11:06:16.653 [INFO][4606] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b" Namespace="calico-system" Pod="calico-kube-controllers-d6547ffc8-zlmj5" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--kube--controllers--d6547ffc8--zlmj5-eth0" Jan 29 11:06:16.684497 systemd-networkd[1383]: cali87b05a793d9: Link UP Jan 29 11:06:16.685349 systemd-networkd[1383]: cali87b05a793d9: Gained carrier Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.305 [INFO][4623] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.354 [INFO][4623] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0 calico-apiserver-f785fb85f- calico-apiserver 7a04fac7-1f8b-48e6-9fb1-4421bdb042d6 708 0 2025-01-29 11:06:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f785fb85f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4152-2-0-b-6e231d00a9 calico-apiserver-f785fb85f-zvlk7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali87b05a793d9 [] []}} 
ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.354 [INFO][4623] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.476 [INFO][4687] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" HandleID="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.494 [INFO][4687] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" HandleID="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fbb10), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4152-2-0-b-6e231d00a9", "pod":"calico-apiserver-f785fb85f-zvlk7", "timestamp":"2025-01-29 11:06:16.475670421 +0000 UTC"}, Hostname:"ci-4152-2-0-b-6e231d00a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.494 [INFO][4687] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.611 [INFO][4687] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.611 [INFO][4687] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152-2-0-b-6e231d00a9' Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.615 [INFO][4687] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.625 [INFO][4687] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.635 [INFO][4687] ipam/ipam.go 489: Trying affinity for 192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.641 [INFO][4687] ipam/ipam.go 155: Attempting to load block cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.646 [INFO][4687] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.647 [INFO][4687] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.649 [INFO][4687] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34 Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.657 [INFO][4687] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.672 [INFO][4687] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.86.194/26] block=192.168.86.192/26 handle="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.672 [INFO][4687] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.86.194/26] handle="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.672 [INFO][4687] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
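The "Link UP" / "Gained carrier" entries from systemd-networkd above (cali3776b9ddb34, cali87b05a793d9) are the host-side veth ends that the CNI plugin creates for each workload, also recorded in the InterfaceName fields of the endpoint dumps. A minimal sketch, standard library only and assuming it runs on this node, that lists those interfaces:

package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		// Calico names the host-side veth of each workload endpoint "cali" + hash.
		if strings.HasPrefix(ifc.Name, "cali") {
			fmt.Printf("%-16s %s\n", ifc.Name, ifc.Flags)
		}
	}
}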
Jan 29 11:06:16.709861 containerd[1478]: 2025-01-29 11:06:16.672 [INFO][4687] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.86.194/26] IPv6=[] ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" HandleID="k8s-pod-network.e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.711767 containerd[1478]: 2025-01-29 11:06:16.678 [INFO][4623] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0", GenerateName:"calico-apiserver-f785fb85f-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a04fac7-1f8b-48e6-9fb1-4421bdb042d6", ResourceVersion:"708", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f785fb85f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"", Pod:"calico-apiserver-f785fb85f-zvlk7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali87b05a793d9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.711767 containerd[1478]: 2025-01-29 11:06:16.678 [INFO][4623] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.86.194/32] ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.711767 containerd[1478]: 2025-01-29 11:06:16.678 [INFO][4623] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali87b05a793d9 ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.711767 containerd[1478]: 2025-01-29 11:06:16.683 [INFO][4623] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.711767 containerd[1478]: 2025-01-29 11:06:16.686 [INFO][4623] cni-plugin/k8s.go 414: Added 
Mac, interface name, and active container ID to endpoint ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0", GenerateName:"calico-apiserver-f785fb85f-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a04fac7-1f8b-48e6-9fb1-4421bdb042d6", ResourceVersion:"708", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f785fb85f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34", Pod:"calico-apiserver-f785fb85f-zvlk7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali87b05a793d9", MAC:"6e:63:f3:b7:f7:f6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.711767 containerd[1478]: 2025-01-29 11:06:16.706 [INFO][4623] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-zvlk7" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--zvlk7-eth0" Jan 29 11:06:16.730856 containerd[1478]: time="2025-01-29T11:06:16.729587270Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:16.730856 containerd[1478]: time="2025-01-29T11:06:16.729954390Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:16.730856 containerd[1478]: time="2025-01-29T11:06:16.729974150Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:16.730856 containerd[1478]: time="2025-01-29T11:06:16.730078431Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:16.756451 systemd-networkd[1383]: cali5a46d4f5a80: Link UP Jan 29 11:06:16.757545 systemd-networkd[1383]: cali5a46d4f5a80: Gained carrier Jan 29 11:06:16.787548 containerd[1478]: time="2025-01-29T11:06:16.786518908Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:16.787548 containerd[1478]: time="2025-01-29T11:06:16.786576828Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:16.787548 containerd[1478]: time="2025-01-29T11:06:16.786588988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:16.787548 containerd[1478]: time="2025-01-29T11:06:16.786672148Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.171 [INFO][4595] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.257 [INFO][4595] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0 csi-node-driver- calico-system 66d59454-c196-4ace-a57f-96550c417a39 583 0 2025-01-29 11:06:02 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4152-2-0-b-6e231d00a9 csi-node-driver-mtvgj eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5a46d4f5a80 [] []}} ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.257 [INFO][4595] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.473 [INFO][4677] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" HandleID="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Workload="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.529 [INFO][4677] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" HandleID="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Workload="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003aa480), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4152-2-0-b-6e231d00a9", "pod":"csi-node-driver-mtvgj", "timestamp":"2025-01-29 11:06:16.47384642 +0000 UTC"}, Hostname:"ci-4152-2-0-b-6e231d00a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.533 [INFO][4677] ipam/ipam_plugin.go 353: About to 
acquire host-wide IPAM lock. Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.672 [INFO][4677] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.672 [INFO][4677] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152-2-0-b-6e231d00a9' Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.675 [INFO][4677] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.685 [INFO][4677] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.695 [INFO][4677] ipam/ipam.go 489: Trying affinity for 192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.703 [INFO][4677] ipam/ipam.go 155: Attempting to load block cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.711 [INFO][4677] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.711 [INFO][4677] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.716 [INFO][4677] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88 Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.726 [INFO][4677] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.735 [INFO][4677] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.86.195/26] block=192.168.86.192/26 handle="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.736 [INFO][4677] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.86.195/26] handle="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.736 [INFO][4677] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
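The repeated "loading plugin io.containerd.runc.v2 …" lines above appear as containerd sets up the runc v2 runtime for the new sandbox tasks, which the CRI plugin keeps in the k8s.io namespace. A hedged sketch of listing those containers with the containerd Go client; the socket path and the github.com/containerd/containerd module path for the 1.x client are assumptions:

package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// CRI-managed containers (including the sandbox/pause containers) live in "k8s.io".
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		panic(err)
	}
	for _, c := range containers {
		fmt.Println(c.ID())
	}
}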
Jan 29 11:06:16.790468 containerd[1478]: 2025-01-29 11:06:16.736 [INFO][4677] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.86.195/26] IPv6=[] ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" HandleID="k8s-pod-network.c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Workload="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.793202 containerd[1478]: 2025-01-29 11:06:16.746 [INFO][4595] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"66d59454-c196-4ace-a57f-96550c417a39", ResourceVersion:"583", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"", Pod:"csi-node-driver-mtvgj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.86.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5a46d4f5a80", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.793202 containerd[1478]: 2025-01-29 11:06:16.747 [INFO][4595] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.86.195/32] ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.793202 containerd[1478]: 2025-01-29 11:06:16.747 [INFO][4595] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5a46d4f5a80 ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.793202 containerd[1478]: 2025-01-29 11:06:16.758 [INFO][4595] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.793202 containerd[1478]: 2025-01-29 11:06:16.764 [INFO][4595] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"66d59454-c196-4ace-a57f-96550c417a39", ResourceVersion:"583", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88", Pod:"csi-node-driver-mtvgj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.86.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5a46d4f5a80", MAC:"36:91:b8:cb:34:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.793202 containerd[1478]: 2025-01-29 11:06:16.783 [INFO][4595] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88" Namespace="calico-system" Pod="csi-node-driver-mtvgj" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-csi--node--driver--mtvgj-eth0" Jan 29 11:06:16.798786 systemd[1]: Started cri-containerd-1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b.scope - libcontainer container 1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b. Jan 29 11:06:16.835050 systemd[1]: Started cri-containerd-e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34.scope - libcontainer container e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34. Jan 29 11:06:16.837555 systemd-networkd[1383]: calie223d541e57: Link UP Jan 29 11:06:16.837738 systemd-networkd[1383]: calie223d541e57: Gained carrier Jan 29 11:06:16.852366 containerd[1478]: time="2025-01-29T11:06:16.851437151Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:16.852899 containerd[1478]: time="2025-01-29T11:06:16.852113232Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:16.852899 containerd[1478]: time="2025-01-29T11:06:16.852137632Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:16.852899 containerd[1478]: time="2025-01-29T11:06:16.852534432Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.275 [INFO][4627] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.340 [INFO][4627] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0 coredns-7db6d8ff4d- kube-system 7be7e7ab-b82d-4802-9eba-8c9eb76668a3 706 0 2025-01-29 11:05:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4152-2-0-b-6e231d00a9 coredns-7db6d8ff4d-dcjg8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie223d541e57 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.341 [INFO][4627] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.510 [INFO][4683] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" HandleID="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Workload="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.567 [INFO][4683] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" HandleID="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Workload="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001f8180), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4152-2-0-b-6e231d00a9", "pod":"coredns-7db6d8ff4d-dcjg8", "timestamp":"2025-01-29 11:06:16.510114124 +0000 UTC"}, Hostname:"ci-4152-2-0-b-6e231d00a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.568 [INFO][4683] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.736 [INFO][4683] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.737 [INFO][4683] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152-2-0-b-6e231d00a9' Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.747 [INFO][4683] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.759 [INFO][4683] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.773 [INFO][4683] ipam/ipam.go 489: Trying affinity for 192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.782 [INFO][4683] ipam/ipam.go 155: Attempting to load block cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.791 [INFO][4683] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.791 [INFO][4683] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.796 [INFO][4683] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.809 [INFO][4683] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.828 [INFO][4683] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.86.196/26] block=192.168.86.192/26 handle="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.828 [INFO][4683] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.86.196/26] handle="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.828 [INFO][4683] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
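Note how every CNI ADD in this section logs "About to acquire host-wide IPAM lock" / "Acquired" / "Released": the five concurrent sandbox setups are serialized by that one lock, which is why the claimed addresses come out strictly in order (.193, .194, .195, .196, then .197 below). A toy sketch of the same pattern (ours, not Calico's implementation), where a mutex-guarded allocator hands out consecutive addresses to concurrent callers:

package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator stands in for the host-wide IPAM lock plus the block state.
type blockAllocator struct {
	mu   sync.Mutex
	next netip.Addr
}

func (a *blockAllocator) assign() netip.Addr {
	a.mu.Lock()
	defer a.mu.Unlock()
	ip := a.next
	a.next = a.next.Next()
	return ip
}

func main() {
	alloc := &blockAllocator{next: netip.MustParseAddr("192.168.86.193")}
	var wg sync.WaitGroup
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			fmt.Println(alloc.assign()) // each caller gets a distinct consecutive address
		}()
	}
	wg.Wait()
}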
Jan 29 11:06:16.876166 containerd[1478]: 2025-01-29 11:06:16.828 [INFO][4683] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.86.196/26] IPv6=[] ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" HandleID="k8s-pod-network.4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Workload="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.877996 containerd[1478]: 2025-01-29 11:06:16.833 [INFO][4627] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7be7e7ab-b82d-4802-9eba-8c9eb76668a3", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 5, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"", Pod:"coredns-7db6d8ff4d-dcjg8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie223d541e57", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.877996 containerd[1478]: 2025-01-29 11:06:16.834 [INFO][4627] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.86.196/32] ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.877996 containerd[1478]: 2025-01-29 11:06:16.834 [INFO][4627] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie223d541e57 ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.877996 containerd[1478]: 2025-01-29 11:06:16.837 [INFO][4627] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" 
WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.877996 containerd[1478]: 2025-01-29 11:06:16.837 [INFO][4627] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7be7e7ab-b82d-4802-9eba-8c9eb76668a3", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 5, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e", Pod:"coredns-7db6d8ff4d-dcjg8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie223d541e57", MAC:"42:3c:ef:f3:9f:38", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.877996 containerd[1478]: 2025-01-29 11:06:16.868 [INFO][4627] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-dcjg8" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--dcjg8-eth0" Jan 29 11:06:16.883007 systemd[1]: Started cri-containerd-c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88.scope - libcontainer container c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88. 
Jan 29 11:06:16.909114 containerd[1478]: time="2025-01-29T11:06:16.909008750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6547ffc8-zlmj5,Uid:4dcc55b2-baa9-4b75-9e62-2012ad104fe8,Namespace:calico-system,Attempt:5,} returns sandbox id \"1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b\"" Jan 29 11:06:16.926446 containerd[1478]: time="2025-01-29T11:06:16.925131360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 29 11:06:16.949689 systemd-networkd[1383]: califde496e88cd: Link UP Jan 29 11:06:16.951054 systemd-networkd[1383]: califde496e88cd: Gained carrier Jan 29 11:06:16.976486 containerd[1478]: time="2025-01-29T11:06:16.976412354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mtvgj,Uid:66d59454-c196-4ace-a57f-96550c417a39,Namespace:calico-system,Attempt:5,} returns sandbox id \"c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88\"" Jan 29 11:06:16.988475 containerd[1478]: time="2025-01-29T11:06:16.987702882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-zvlk7,Uid:7a04fac7-1f8b-48e6-9fb1-4421bdb042d6,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34\"" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.303 [INFO][4654] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.354 [INFO][4654] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0 calico-apiserver-f785fb85f- calico-apiserver 03412989-750a-48ed-b795-b7c29a91242d 711 0 2025-01-29 11:06:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f785fb85f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4152-2-0-b-6e231d00a9 calico-apiserver-f785fb85f-vp8hz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califde496e88cd [] []}} ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.354 [INFO][4654] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.529 [INFO][4695] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" HandleID="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.571 [INFO][4695] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" 
HandleID="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400023b7b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4152-2-0-b-6e231d00a9", "pod":"calico-apiserver-f785fb85f-vp8hz", "timestamp":"2025-01-29 11:06:16.522730493 +0000 UTC"}, Hostname:"ci-4152-2-0-b-6e231d00a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.571 [INFO][4695] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.829 [INFO][4695] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.829 [INFO][4695] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152-2-0-b-6e231d00a9' Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.843 [INFO][4695] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.862 [INFO][4695] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.879 [INFO][4695] ipam/ipam.go 489: Trying affinity for 192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.885 [INFO][4695] ipam/ipam.go 155: Attempting to load block cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.889 [INFO][4695] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.889 [INFO][4695] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.891 [INFO][4695] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.909 [INFO][4695] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.923 [INFO][4695] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.86.197/26] block=192.168.86.192/26 handle="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.923 [INFO][4695] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.86.197/26] handle="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.924 [INFO][4695] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 29 11:06:16.994907 containerd[1478]: 2025-01-29 11:06:16.926 [INFO][4695] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.86.197/26] IPv6=[] ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" HandleID="k8s-pod-network.e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Workload="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:16.996308 containerd[1478]: 2025-01-29 11:06:16.933 [INFO][4654] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0", GenerateName:"calico-apiserver-f785fb85f-", Namespace:"calico-apiserver", SelfLink:"", UID:"03412989-750a-48ed-b795-b7c29a91242d", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f785fb85f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"", Pod:"calico-apiserver-f785fb85f-vp8hz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califde496e88cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.996308 containerd[1478]: 2025-01-29 11:06:16.933 [INFO][4654] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.86.197/32] ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:16.996308 containerd[1478]: 2025-01-29 11:06:16.933 [INFO][4654] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califde496e88cd ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:16.996308 containerd[1478]: 2025-01-29 11:06:16.959 [INFO][4654] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:16.996308 containerd[1478]: 2025-01-29 11:06:16.960 [INFO][4654] cni-plugin/k8s.go 414: Added 
Mac, interface name, and active container ID to endpoint ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0", GenerateName:"calico-apiserver-f785fb85f-", Namespace:"calico-apiserver", SelfLink:"", UID:"03412989-750a-48ed-b795-b7c29a91242d", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f785fb85f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e", Pod:"calico-apiserver-f785fb85f-vp8hz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califde496e88cd", MAC:"c2:da:17:ba:eb:fb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:16.996308 containerd[1478]: 2025-01-29 11:06:16.979 [INFO][4654] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e" Namespace="calico-apiserver" Pod="calico-apiserver-f785fb85f-vp8hz" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-calico--apiserver--f785fb85f--vp8hz-eth0" Jan 29 11:06:17.004089 containerd[1478]: time="2025-01-29T11:06:17.001568291Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:17.004245 containerd[1478]: time="2025-01-29T11:06:17.003967972Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:17.004245 containerd[1478]: time="2025-01-29T11:06:17.004037493Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:17.004989 containerd[1478]: time="2025-01-29T11:06:17.004559813Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:17.024248 systemd[1]: Started cri-containerd-4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e.scope - libcontainer container 4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e. 
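Editor's note: systemd-networkd reports each host-side Calico veth (califde496e88cd, cali60a4f6018fe, ...) gaining carrier as the endpoints are wired up. To enumerate those interfaces on the node itself, a minimal Go check could look like this; it assumes it runs on the node and relies only on the cali name prefix visible in the log:

package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		// Host-side workload veths are named cali<short hash>, as in the entries above.
		if strings.HasPrefix(ifc.Name, "cali") {
			fmt.Println(ifc.Name, ifc.Flags)
		}
	}
}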
Jan 29 11:06:17.035670 systemd-networkd[1383]: cali60a4f6018fe: Link UP Jan 29 11:06:17.036924 systemd-networkd[1383]: cali60a4f6018fe: Gained carrier Jan 29 11:06:17.040065 containerd[1478]: time="2025-01-29T11:06:17.038798952Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:17.040065 containerd[1478]: time="2025-01-29T11:06:17.039127273Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:17.040065 containerd[1478]: time="2025-01-29T11:06:17.039149073Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:17.040976 containerd[1478]: time="2025-01-29T11:06:17.040205473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:17.066603 systemd[1]: Started cri-containerd-e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e.scope - libcontainer container e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e. Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.291 [INFO][4649] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.346 [INFO][4649] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0 coredns-7db6d8ff4d- kube-system f133f1ed-4ff6-4186-84a9-0e6e2dda3b55 709 0 2025-01-29 11:05:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4152-2-0-b-6e231d00a9 coredns-7db6d8ff4d-mff4v eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali60a4f6018fe [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.348 [INFO][4649] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.550 [INFO][4684] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" HandleID="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Workload="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.581 [INFO][4684] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" HandleID="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Workload="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024f590), Attrs:map[string]string{"namespace":"kube-system", 
"node":"ci-4152-2-0-b-6e231d00a9", "pod":"coredns-7db6d8ff4d-mff4v", "timestamp":"2025-01-29 11:06:16.54950875 +0000 UTC"}, Hostname:"ci-4152-2-0-b-6e231d00a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.582 [INFO][4684] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.927 [INFO][4684] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.927 [INFO][4684] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4152-2-0-b-6e231d00a9' Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.934 [INFO][4684] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.957 [INFO][4684] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.983 [INFO][4684] ipam/ipam.go 489: Trying affinity for 192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:16.995 [INFO][4684] ipam/ipam.go 155: Attempting to load block cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.001 [INFO][4684] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.001 [INFO][4684] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.005 [INFO][4684] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2 Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.011 [INFO][4684] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.026 [INFO][4684] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.86.198/26] block=192.168.86.192/26 handle="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.026 [INFO][4684] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.86.198/26] handle="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" host="ci-4152-2-0-b-6e231d00a9" Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.026 [INFO][4684] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 29 11:06:17.067157 containerd[1478]: 2025-01-29 11:06:17.026 [INFO][4684] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.86.198/26] IPv6=[] ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" HandleID="k8s-pod-network.82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Workload="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.068322 containerd[1478]: 2025-01-29 11:06:17.029 [INFO][4649] cni-plugin/k8s.go 386: Populated endpoint ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f133f1ed-4ff6-4186-84a9-0e6e2dda3b55", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 5, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"", Pod:"coredns-7db6d8ff4d-mff4v", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali60a4f6018fe", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:17.068322 containerd[1478]: 2025-01-29 11:06:17.029 [INFO][4649] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.86.198/32] ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.068322 containerd[1478]: 2025-01-29 11:06:17.029 [INFO][4649] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60a4f6018fe ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.068322 containerd[1478]: 2025-01-29 11:06:17.037 [INFO][4649] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" 
WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.068322 containerd[1478]: 2025-01-29 11:06:17.041 [INFO][4649] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f133f1ed-4ff6-4186-84a9-0e6e2dda3b55", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 11, 5, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4152-2-0-b-6e231d00a9", ContainerID:"82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2", Pod:"coredns-7db6d8ff4d-mff4v", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali60a4f6018fe", MAC:"76:26:f8:39:92:7a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 11:06:17.068322 containerd[1478]: 2025-01-29 11:06:17.059 [INFO][4649] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-mff4v" WorkloadEndpoint="ci--4152--2--0--b--6e231d00a9-k8s-coredns--7db6d8ff4d--mff4v-eth0" Jan 29 11:06:17.097102 containerd[1478]: time="2025-01-29T11:06:17.096996546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-dcjg8,Uid:7be7e7ab-b82d-4802-9eba-8c9eb76668a3,Namespace:kube-system,Attempt:5,} returns sandbox id \"4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e\"" Jan 29 11:06:17.106330 containerd[1478]: time="2025-01-29T11:06:17.106195711Z" level=info msg="CreateContainer within sandbox \"4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 29 11:06:17.126399 containerd[1478]: time="2025-01-29T11:06:17.126215723Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 11:06:17.126399 containerd[1478]: time="2025-01-29T11:06:17.126315843Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 11:06:17.127860 containerd[1478]: time="2025-01-29T11:06:17.126471203Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:17.127860 containerd[1478]: time="2025-01-29T11:06:17.126590483Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 11:06:17.132652 containerd[1478]: time="2025-01-29T11:06:17.132544086Z" level=info msg="CreateContainer within sandbox \"4a0803af52b7d25d7d33729df01c6218d0e797ae7f617e4c2d510714661f717e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b79594fa6c84b52763187470b940ddf5d03ace1b72fc1b75fba3cfea0a523dfa\"" Jan 29 11:06:17.134256 containerd[1478]: time="2025-01-29T11:06:17.134221927Z" level=info msg="StartContainer for \"b79594fa6c84b52763187470b940ddf5d03ace1b72fc1b75fba3cfea0a523dfa\"" Jan 29 11:06:17.139373 containerd[1478]: time="2025-01-29T11:06:17.139123290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f785fb85f-vp8hz,Uid:03412989-750a-48ed-b795-b7c29a91242d,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e\"" Jan 29 11:06:17.154926 kubelet[2815]: I0129 11:06:17.154171 2815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 11:06:17.162285 systemd[1]: Started cri-containerd-82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2.scope - libcontainer container 82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2. Jan 29 11:06:17.185337 systemd[1]: Started cri-containerd-b79594fa6c84b52763187470b940ddf5d03ace1b72fc1b75fba3cfea0a523dfa.scope - libcontainer container b79594fa6c84b52763187470b940ddf5d03ace1b72fc1b75fba3cfea0a523dfa. 
Jan 29 11:06:17.217314 containerd[1478]: time="2025-01-29T11:06:17.217262295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-mff4v,Uid:f133f1ed-4ff6-4186-84a9-0e6e2dda3b55,Namespace:kube-system,Attempt:5,} returns sandbox id \"82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2\"" Jan 29 11:06:17.223768 containerd[1478]: time="2025-01-29T11:06:17.223409579Z" level=info msg="CreateContainer within sandbox \"82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 29 11:06:17.224328 containerd[1478]: time="2025-01-29T11:06:17.224293379Z" level=info msg="StartContainer for \"b79594fa6c84b52763187470b940ddf5d03ace1b72fc1b75fba3cfea0a523dfa\" returns successfully" Jan 29 11:06:17.241709 containerd[1478]: time="2025-01-29T11:06:17.241645949Z" level=info msg="CreateContainer within sandbox \"82bc74319666264b0ef8e5851a494a43170e08bf4595162db5b2f00aac2aa9a2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3b38611edbe258cf00653f346cf6ba72b327b9efae9b3cd9b2d12db599bcae8c\"" Jan 29 11:06:17.244800 containerd[1478]: time="2025-01-29T11:06:17.243969390Z" level=info msg="StartContainer for \"3b38611edbe258cf00653f346cf6ba72b327b9efae9b3cd9b2d12db599bcae8c\"" Jan 29 11:06:17.275994 systemd[1]: Started cri-containerd-3b38611edbe258cf00653f346cf6ba72b327b9efae9b3cd9b2d12db599bcae8c.scope - libcontainer container 3b38611edbe258cf00653f346cf6ba72b327b9efae9b3cd9b2d12db599bcae8c. Jan 29 11:06:17.312990 containerd[1478]: time="2025-01-29T11:06:17.312775630Z" level=info msg="StartContainer for \"3b38611edbe258cf00653f346cf6ba72b327b9efae9b3cd9b2d12db599bcae8c\" returns successfully" Jan 29 11:06:17.959202 systemd-networkd[1383]: calie223d541e57: Gained IPv6LL Jan 29 11:06:18.023129 systemd-networkd[1383]: cali87b05a793d9: Gained IPv6LL Jan 29 11:06:18.087218 systemd-networkd[1383]: cali5a46d4f5a80: Gained IPv6LL Jan 29 11:06:18.187607 kubelet[2815]: I0129 11:06:18.187535 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-dcjg8" podStartSLOduration=24.187518556 podStartE2EDuration="24.187518556s" podCreationTimestamp="2025-01-29 11:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 11:06:18.185511475 +0000 UTC m=+37.633867621" watchObservedRunningTime="2025-01-29 11:06:18.187518556 +0000 UTC m=+37.635874662" Jan 29 11:06:18.215075 systemd-networkd[1383]: califde496e88cd: Gained IPv6LL Jan 29 11:06:18.228381 kubelet[2815]: I0129 11:06:18.227730 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-mff4v" podStartSLOduration=24.227712576 podStartE2EDuration="24.227712576s" podCreationTimestamp="2025-01-29 11:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 11:06:18.203789684 +0000 UTC m=+37.652145790" watchObservedRunningTime="2025-01-29 11:06:18.227712576 +0000 UTC m=+37.676068682" Jan 29 11:06:18.599225 systemd-networkd[1383]: cali3776b9ddb34: Gained IPv6LL Jan 29 11:06:18.599537 systemd-networkd[1383]: cali60a4f6018fe: Gained IPv6LL Jan 29 11:06:20.476759 containerd[1478]: time="2025-01-29T11:06:20.475947426Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=31953828" Jan 29 11:06:20.479299 
containerd[1478]: time="2025-01-29T11:06:20.479265027Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"33323450\" in 3.552227905s" Jan 29 11:06:20.479417 containerd[1478]: time="2025-01-29T11:06:20.479402587Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\"" Jan 29 11:06:20.479977 containerd[1478]: time="2025-01-29T11:06:20.479939387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:20.480842 containerd[1478]: time="2025-01-29T11:06:20.480799467Z" level=info msg="ImageCreate event name:\"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:20.481569 containerd[1478]: time="2025-01-29T11:06:20.481543988Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:20.482459 containerd[1478]: time="2025-01-29T11:06:20.482435628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 29 11:06:20.497732 containerd[1478]: time="2025-01-29T11:06:20.497673313Z" level=info msg="CreateContainer within sandbox \"1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 29 11:06:20.548486 containerd[1478]: time="2025-01-29T11:06:20.548324769Z" level=info msg="CreateContainer within sandbox \"1f5c5c395c6c76b45a5eb088de1ef69ef98de22504fa1743d0aa3ba07622b78b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111\"" Jan 29 11:06:20.549615 containerd[1478]: time="2025-01-29T11:06:20.549574449Z" level=info msg="StartContainer for \"717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111\"" Jan 29 11:06:20.596066 systemd[1]: Started cri-containerd-717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111.scope - libcontainer container 717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111. 
Jan 29 11:06:20.637865 containerd[1478]: time="2025-01-29T11:06:20.636715637Z" level=info msg="StartContainer for \"717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111\" returns successfully" Jan 29 11:06:21.210511 kubelet[2815]: I0129 11:06:21.210449 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-d6547ffc8-zlmj5" podStartSLOduration=15.649971374 podStartE2EDuration="19.209809043s" podCreationTimestamp="2025-01-29 11:06:02 +0000 UTC" firstStartedPulling="2025-01-29 11:06:16.921529518 +0000 UTC m=+36.369885584" lastFinishedPulling="2025-01-29 11:06:20.481367147 +0000 UTC m=+39.929723253" observedRunningTime="2025-01-29 11:06:21.209157323 +0000 UTC m=+40.657513429" watchObservedRunningTime="2025-01-29 11:06:21.209809043 +0000 UTC m=+40.658165149" Jan 29 11:06:21.489933 systemd[1]: run-containerd-runc-k8s.io-717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111-runc.HFMhiU.mount: Deactivated successfully. Jan 29 11:06:22.090706 containerd[1478]: time="2025-01-29T11:06:22.090635528Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:22.093051 containerd[1478]: time="2025-01-29T11:06:22.092980768Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7464730" Jan 29 11:06:22.097376 containerd[1478]: time="2025-01-29T11:06:22.097298929Z" level=info msg="ImageCreate event name:\"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:22.105030 containerd[1478]: time="2025-01-29T11:06:22.104961770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:22.105727 containerd[1478]: time="2025-01-29T11:06:22.105674610Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"8834384\" in 1.623127022s" Jan 29 11:06:22.105727 containerd[1478]: time="2025-01-29T11:06:22.105708410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\"" Jan 29 11:06:22.107128 containerd[1478]: time="2025-01-29T11:06:22.107037450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 29 11:06:22.110701 containerd[1478]: time="2025-01-29T11:06:22.110621451Z" level=info msg="CreateContainer within sandbox \"c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 29 11:06:22.156739 containerd[1478]: time="2025-01-29T11:06:22.156590418Z" level=info msg="CreateContainer within sandbox \"c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"939d3acd6bbf91f6bd55bc158526ea2c2eb1c4946f4480d69a620215c1c61ef6\"" Jan 29 11:06:22.159490 containerd[1478]: time="2025-01-29T11:06:22.158675899Z" level=info msg="StartContainer for 
\"939d3acd6bbf91f6bd55bc158526ea2c2eb1c4946f4480d69a620215c1c61ef6\"" Jan 29 11:06:22.202139 systemd[1]: Started cri-containerd-939d3acd6bbf91f6bd55bc158526ea2c2eb1c4946f4480d69a620215c1c61ef6.scope - libcontainer container 939d3acd6bbf91f6bd55bc158526ea2c2eb1c4946f4480d69a620215c1c61ef6. Jan 29 11:06:22.265587 containerd[1478]: time="2025-01-29T11:06:22.265530836Z" level=info msg="StartContainer for \"939d3acd6bbf91f6bd55bc158526ea2c2eb1c4946f4480d69a620215c1c61ef6\" returns successfully" Jan 29 11:06:22.356519 kubelet[2815]: I0129 11:06:22.354051 2815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 11:06:23.974761 containerd[1478]: time="2025-01-29T11:06:23.974409360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:23.975746 containerd[1478]: time="2025-01-29T11:06:23.975701200Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=39298409" Jan 29 11:06:23.976460 containerd[1478]: time="2025-01-29T11:06:23.976352280Z" level=info msg="ImageCreate event name:\"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:23.986708 containerd[1478]: time="2025-01-29T11:06:23.986610881Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:23.987904 containerd[1478]: time="2025-01-29T11:06:23.987505361Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 1.880410911s" Jan 29 11:06:23.987904 containerd[1478]: time="2025-01-29T11:06:23.987544801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\"" Jan 29 11:06:23.990475 containerd[1478]: time="2025-01-29T11:06:23.990411402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 29 11:06:23.993175 containerd[1478]: time="2025-01-29T11:06:23.993125562Z" level=info msg="CreateContainer within sandbox \"e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 29 11:06:24.013283 containerd[1478]: time="2025-01-29T11:06:24.013206443Z" level=info msg="CreateContainer within sandbox \"e7aa74018858704c5a81d688e8eba26e5c94b750f694db8571c095e482bd2b34\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d918ea4c739d155ca02c0144883e7237d978928849d581975cf76bccbf18a275\"" Jan 29 11:06:24.016902 containerd[1478]: time="2025-01-29T11:06:24.014205403Z" level=info msg="StartContainer for \"d918ea4c739d155ca02c0144883e7237d978928849d581975cf76bccbf18a275\"" Jan 29 11:06:24.057027 systemd[1]: Started cri-containerd-d918ea4c739d155ca02c0144883e7237d978928849d581975cf76bccbf18a275.scope - libcontainer container d918ea4c739d155ca02c0144883e7237d978928849d581975cf76bccbf18a275. 
Jan 29 11:06:24.115199 containerd[1478]: time="2025-01-29T11:06:24.115129724Z" level=info msg="StartContainer for \"d918ea4c739d155ca02c0144883e7237d978928849d581975cf76bccbf18a275\" returns successfully" Jan 29 11:06:24.368964 containerd[1478]: time="2025-01-29T11:06:24.368698328Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:24.369855 containerd[1478]: time="2025-01-29T11:06:24.369616448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 29 11:06:24.372613 containerd[1478]: time="2025-01-29T11:06:24.372481928Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"40668079\" in 382.013246ms" Jan 29 11:06:24.372613 containerd[1478]: time="2025-01-29T11:06:24.372513448Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\"" Jan 29 11:06:24.374625 containerd[1478]: time="2025-01-29T11:06:24.374109848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 29 11:06:24.375506 containerd[1478]: time="2025-01-29T11:06:24.375469528Z" level=info msg="CreateContainer within sandbox \"e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 29 11:06:24.396139 containerd[1478]: time="2025-01-29T11:06:24.396008688Z" level=info msg="CreateContainer within sandbox \"e6492fb6acd453c4c71e8dfc00899d94311669a203c5d2ec2fbe9de6c95b976e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a9bde10629ee9529815eae8734c26b52691105a74df13c4a283372de43982bd2\"" Jan 29 11:06:24.396782 containerd[1478]: time="2025-01-29T11:06:24.396708808Z" level=info msg="StartContainer for \"a9bde10629ee9529815eae8734c26b52691105a74df13c4a283372de43982bd2\"" Jan 29 11:06:24.435021 systemd[1]: Started cri-containerd-a9bde10629ee9529815eae8734c26b52691105a74df13c4a283372de43982bd2.scope - libcontainer container a9bde10629ee9529815eae8734c26b52691105a74df13c4a283372de43982bd2. 
Jan 29 11:06:24.497737 containerd[1478]: time="2025-01-29T11:06:24.497291170Z" level=info msg="StartContainer for \"a9bde10629ee9529815eae8734c26b52691105a74df13c4a283372de43982bd2\" returns successfully" Jan 29 11:06:24.673069 kubelet[2815]: I0129 11:06:24.673011 2815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 11:06:24.689676 kubelet[2815]: I0129 11:06:24.689193 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f785fb85f-zvlk7" podStartSLOduration=15.690774256 podStartE2EDuration="22.689175733s" podCreationTimestamp="2025-01-29 11:06:02 +0000 UTC" firstStartedPulling="2025-01-29 11:06:16.990673924 +0000 UTC m=+36.439029990" lastFinishedPulling="2025-01-29 11:06:23.989075321 +0000 UTC m=+43.437431467" observedRunningTime="2025-01-29 11:06:24.254125566 +0000 UTC m=+43.702481672" watchObservedRunningTime="2025-01-29 11:06:24.689175733 +0000 UTC m=+44.137531839" Jan 29 11:06:25.247691 kubelet[2815]: I0129 11:06:25.246358 2815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 11:06:25.380908 kernel: bpftool[5598]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 29 11:06:26.101991 systemd-networkd[1383]: vxlan.calico: Link UP Jan 29 11:06:26.102000 systemd-networkd[1383]: vxlan.calico: Gained carrier Jan 29 11:06:26.538255 kubelet[2815]: I0129 11:06:26.537348 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f785fb85f-vp8hz" podStartSLOduration=17.307074259 podStartE2EDuration="24.537326015s" podCreationTimestamp="2025-01-29 11:06:02 +0000 UTC" firstStartedPulling="2025-01-29 11:06:17.143184052 +0000 UTC m=+36.591540158" lastFinishedPulling="2025-01-29 11:06:24.373435808 +0000 UTC m=+43.821791914" observedRunningTime="2025-01-29 11:06:25.269691082 +0000 UTC m=+44.718047148" watchObservedRunningTime="2025-01-29 11:06:26.537326015 +0000 UTC m=+45.985682121" Jan 29 11:06:27.457110 containerd[1478]: time="2025-01-29T11:06:27.457052631Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:27.458421 containerd[1478]: time="2025-01-29T11:06:27.458192511Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=9883368" Jan 29 11:06:27.459787 containerd[1478]: time="2025-01-29T11:06:27.459445151Z" level=info msg="ImageCreate event name:\"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:27.462307 containerd[1478]: time="2025-01-29T11:06:27.462247150Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 11:06:27.463494 containerd[1478]: time="2025-01-29T11:06:27.463456270Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11252974\" in 3.089314462s" Jan 29 11:06:27.463494 containerd[1478]: 
time="2025-01-29T11:06:27.463493950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\"" Jan 29 11:06:27.468033 containerd[1478]: time="2025-01-29T11:06:27.467996349Z" level=info msg="CreateContainer within sandbox \"c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 29 11:06:27.486237 containerd[1478]: time="2025-01-29T11:06:27.486160546Z" level=info msg="CreateContainer within sandbox \"c07e54096b719190bc0978f77f1261f66a9e806ac3a885a56315bcfe82f32a88\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3801eb10c3259f98a1cdd2b13806b537824b04f110c479b9c574c7984cb9c34f\"" Jan 29 11:06:27.488860 containerd[1478]: time="2025-01-29T11:06:27.488146426Z" level=info msg="StartContainer for \"3801eb10c3259f98a1cdd2b13806b537824b04f110c479b9c574c7984cb9c34f\"" Jan 29 11:06:27.521034 systemd[1]: Started cri-containerd-3801eb10c3259f98a1cdd2b13806b537824b04f110c479b9c574c7984cb9c34f.scope - libcontainer container 3801eb10c3259f98a1cdd2b13806b537824b04f110c479b9c574c7984cb9c34f. Jan 29 11:06:27.563117 containerd[1478]: time="2025-01-29T11:06:27.562681011Z" level=info msg="StartContainer for \"3801eb10c3259f98a1cdd2b13806b537824b04f110c479b9c574c7984cb9c34f\" returns successfully" Jan 29 11:06:27.801397 kubelet[2815]: I0129 11:06:27.801265 2815 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 29 11:06:27.805531 kubelet[2815]: I0129 11:06:27.805469 2815 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 29 11:06:28.007135 systemd-networkd[1383]: vxlan.calico: Gained IPv6LL Jan 29 11:06:28.285608 kubelet[2815]: I0129 11:06:28.284266 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-mtvgj" podStartSLOduration=15.805772467 podStartE2EDuration="26.284236536s" podCreationTimestamp="2025-01-29 11:06:02 +0000 UTC" firstStartedPulling="2025-01-29 11:06:16.986287841 +0000 UTC m=+36.434643947" lastFinishedPulling="2025-01-29 11:06:27.46475191 +0000 UTC m=+46.913108016" observedRunningTime="2025-01-29 11:06:28.282722057 +0000 UTC m=+47.731078163" watchObservedRunningTime="2025-01-29 11:06:28.284236536 +0000 UTC m=+47.732592682" Jan 29 11:06:32.401280 kubelet[2815]: I0129 11:06:32.401137 2815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 11:06:40.683364 containerd[1478]: time="2025-01-29T11:06:40.683317154Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:40.683746 containerd[1478]: time="2025-01-29T11:06:40.683485034Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:40.683746 containerd[1478]: time="2025-01-29T11:06:40.683522714Z" level=info msg="StopPodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 11:06:40.685247 containerd[1478]: time="2025-01-29T11:06:40.684342233Z" level=info msg="RemovePodSandbox for \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:40.685247 containerd[1478]: 
time="2025-01-29T11:06:40.684377673Z" level=info msg="Forcibly stopping sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\"" Jan 29 11:06:40.685247 containerd[1478]: time="2025-01-29T11:06:40.684458193Z" level=info msg="TearDown network for sandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" successfully" Jan 29 11:06:40.694646 containerd[1478]: time="2025-01-29T11:06:40.694597064Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.696243 containerd[1478]: time="2025-01-29T11:06:40.696013503Z" level=info msg="RemovePodSandbox \"7431624348c8c3464e1532ad1a1d1a9d93d3e25da8398608ad9d7546815e7aab\" returns successfully" Jan 29 11:06:40.698385 containerd[1478]: time="2025-01-29T11:06:40.698186541Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:40.698385 containerd[1478]: time="2025-01-29T11:06:40.698304461Z" level=info msg="TearDown network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" successfully" Jan 29 11:06:40.698385 containerd[1478]: time="2025-01-29T11:06:40.698313941Z" level=info msg="StopPodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" returns successfully" Jan 29 11:06:40.699001 containerd[1478]: time="2025-01-29T11:06:40.698857140Z" level=info msg="RemovePodSandbox for \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:40.699064 containerd[1478]: time="2025-01-29T11:06:40.699006500Z" level=info msg="Forcibly stopping sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\"" Jan 29 11:06:40.699099 containerd[1478]: time="2025-01-29T11:06:40.699074100Z" level=info msg="TearDown network for sandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" successfully" Jan 29 11:06:40.705120 containerd[1478]: time="2025-01-29T11:06:40.705065615Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.705120 containerd[1478]: time="2025-01-29T11:06:40.705142495Z" level=info msg="RemovePodSandbox \"10b4ecdf1907eb90913caa063ba7e33f360560eccb2f67eeffb3ebe8b756c7b5\" returns successfully" Jan 29 11:06:40.706340 containerd[1478]: time="2025-01-29T11:06:40.705980534Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" Jan 29 11:06:40.706340 containerd[1478]: time="2025-01-29T11:06:40.706181934Z" level=info msg="TearDown network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" successfully" Jan 29 11:06:40.706340 containerd[1478]: time="2025-01-29T11:06:40.706207094Z" level=info msg="StopPodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" returns successfully" Jan 29 11:06:40.706774 containerd[1478]: time="2025-01-29T11:06:40.706557853Z" level=info msg="RemovePodSandbox for \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" Jan 29 11:06:40.706774 containerd[1478]: time="2025-01-29T11:06:40.706587013Z" level=info msg="Forcibly stopping sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\"" Jan 29 11:06:40.706774 containerd[1478]: time="2025-01-29T11:06:40.706651053Z" level=info msg="TearDown network for sandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" successfully" Jan 29 11:06:40.710024 containerd[1478]: time="2025-01-29T11:06:40.709973090Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.710102 containerd[1478]: time="2025-01-29T11:06:40.710046290Z" level=info msg="RemovePodSandbox \"2c9ffe62fcb33c42ec0467db7f9c5ae71676fd8493189138d604f9ba02305840\" returns successfully" Jan 29 11:06:40.710735 containerd[1478]: time="2025-01-29T11:06:40.710500650Z" level=info msg="StopPodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\"" Jan 29 11:06:40.710735 containerd[1478]: time="2025-01-29T11:06:40.710645530Z" level=info msg="TearDown network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" successfully" Jan 29 11:06:40.710735 containerd[1478]: time="2025-01-29T11:06:40.710658130Z" level=info msg="StopPodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" returns successfully" Jan 29 11:06:40.711335 containerd[1478]: time="2025-01-29T11:06:40.711201609Z" level=info msg="RemovePodSandbox for \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\"" Jan 29 11:06:40.711335 containerd[1478]: time="2025-01-29T11:06:40.711228489Z" level=info msg="Forcibly stopping sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\"" Jan 29 11:06:40.711335 containerd[1478]: time="2025-01-29T11:06:40.711297489Z" level=info msg="TearDown network for sandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" successfully" Jan 29 11:06:40.714414 containerd[1478]: time="2025-01-29T11:06:40.714340207Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.714414 containerd[1478]: time="2025-01-29T11:06:40.714408967Z" level=info msg="RemovePodSandbox \"1fddaddeb149499f348ba2620d6b5efb93d78ce03e21db797d4b9f30d9cc3c01\" returns successfully" Jan 29 11:06:40.714799 containerd[1478]: time="2025-01-29T11:06:40.714773726Z" level=info msg="StopPodSandbox for \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\"" Jan 29 11:06:40.714947 containerd[1478]: time="2025-01-29T11:06:40.714892526Z" level=info msg="TearDown network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\" successfully" Jan 29 11:06:40.714947 containerd[1478]: time="2025-01-29T11:06:40.714911046Z" level=info msg="StopPodSandbox for \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\" returns successfully" Jan 29 11:06:40.715240 containerd[1478]: time="2025-01-29T11:06:40.715219806Z" level=info msg="RemovePodSandbox for \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\"" Jan 29 11:06:40.715288 containerd[1478]: time="2025-01-29T11:06:40.715247526Z" level=info msg="Forcibly stopping sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\"" Jan 29 11:06:40.715311 containerd[1478]: time="2025-01-29T11:06:40.715303486Z" level=info msg="TearDown network for sandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\" successfully" Jan 29 11:06:40.718620 containerd[1478]: time="2025-01-29T11:06:40.718368923Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.718620 containerd[1478]: time="2025-01-29T11:06:40.718468363Z" level=info msg="RemovePodSandbox \"64bf9291d58917cf623675872ede0e1c6b5abb62e4fa7688eaa5975e6e14668a\" returns successfully" Jan 29 11:06:40.721592 containerd[1478]: time="2025-01-29T11:06:40.720666041Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:40.721592 containerd[1478]: time="2025-01-29T11:06:40.720901841Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:40.721592 containerd[1478]: time="2025-01-29T11:06:40.720929441Z" level=info msg="StopPodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:40.722320 containerd[1478]: time="2025-01-29T11:06:40.721980480Z" level=info msg="RemovePodSandbox for \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:40.722320 containerd[1478]: time="2025-01-29T11:06:40.722009680Z" level=info msg="Forcibly stopping sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\"" Jan 29 11:06:40.722320 containerd[1478]: time="2025-01-29T11:06:40.722080400Z" level=info msg="TearDown network for sandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" successfully" Jan 29 11:06:40.726206 containerd[1478]: time="2025-01-29T11:06:40.726150076Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.726367 containerd[1478]: time="2025-01-29T11:06:40.726237396Z" level=info msg="RemovePodSandbox \"42bdab4ad76501691fb79e2d39d484ba951248c9818933fbb4007b31de45872b\" returns successfully" Jan 29 11:06:40.727161 containerd[1478]: time="2025-01-29T11:06:40.727129035Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 11:06:40.727262 containerd[1478]: time="2025-01-29T11:06:40.727240355Z" level=info msg="TearDown network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" successfully" Jan 29 11:06:40.727262 containerd[1478]: time="2025-01-29T11:06:40.727255955Z" level=info msg="StopPodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" returns successfully" Jan 29 11:06:40.727917 containerd[1478]: time="2025-01-29T11:06:40.727599835Z" level=info msg="RemovePodSandbox for \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 11:06:40.727917 containerd[1478]: time="2025-01-29T11:06:40.727638075Z" level=info msg="Forcibly stopping sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\"" Jan 29 11:06:40.727917 containerd[1478]: time="2025-01-29T11:06:40.727719035Z" level=info msg="TearDown network for sandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" successfully" Jan 29 11:06:40.731307 containerd[1478]: time="2025-01-29T11:06:40.731251312Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.731458 containerd[1478]: time="2025-01-29T11:06:40.731439951Z" level=info msg="RemovePodSandbox \"6e73b1f8173129fd49701689f550ecf7e3c67eea95ee1bc909bb9cada1700f3f\" returns successfully" Jan 29 11:06:40.732121 containerd[1478]: time="2025-01-29T11:06:40.732089351Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" Jan 29 11:06:40.733874 containerd[1478]: time="2025-01-29T11:06:40.733325630Z" level=info msg="TearDown network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" successfully" Jan 29 11:06:40.733874 containerd[1478]: time="2025-01-29T11:06:40.733349350Z" level=info msg="StopPodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" returns successfully" Jan 29 11:06:40.734254 containerd[1478]: time="2025-01-29T11:06:40.734223949Z" level=info msg="RemovePodSandbox for \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" Jan 29 11:06:40.734301 containerd[1478]: time="2025-01-29T11:06:40.734259309Z" level=info msg="Forcibly stopping sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\"" Jan 29 11:06:40.734334 containerd[1478]: time="2025-01-29T11:06:40.734319989Z" level=info msg="TearDown network for sandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" successfully" Jan 29 11:06:40.739900 containerd[1478]: time="2025-01-29T11:06:40.739804504Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.740026 containerd[1478]: time="2025-01-29T11:06:40.739919264Z" level=info msg="RemovePodSandbox \"ac18d1e6646b6f35bc7331343b45bc6d4719a3a28511ac397104118b881701c4\" returns successfully" Jan 29 11:06:40.741996 containerd[1478]: time="2025-01-29T11:06:40.741743342Z" level=info msg="StopPodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\"" Jan 29 11:06:40.741996 containerd[1478]: time="2025-01-29T11:06:40.741865302Z" level=info msg="TearDown network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" successfully" Jan 29 11:06:40.741996 containerd[1478]: time="2025-01-29T11:06:40.741876262Z" level=info msg="StopPodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" returns successfully" Jan 29 11:06:40.743227 containerd[1478]: time="2025-01-29T11:06:40.743098861Z" level=info msg="RemovePodSandbox for \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\"" Jan 29 11:06:40.743227 containerd[1478]: time="2025-01-29T11:06:40.743131501Z" level=info msg="Forcibly stopping sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\"" Jan 29 11:06:40.743227 containerd[1478]: time="2025-01-29T11:06:40.743197101Z" level=info msg="TearDown network for sandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" successfully" Jan 29 11:06:40.746852 containerd[1478]: time="2025-01-29T11:06:40.746528338Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.746852 containerd[1478]: time="2025-01-29T11:06:40.746601698Z" level=info msg="RemovePodSandbox \"a3da2ffdd91de8bae895b9c1898df2cef9420d511b420f57e6ae68d59f1d60c7\" returns successfully" Jan 29 11:06:40.747638 containerd[1478]: time="2025-01-29T11:06:40.747458857Z" level=info msg="StopPodSandbox for \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\"" Jan 29 11:06:40.747638 containerd[1478]: time="2025-01-29T11:06:40.747592057Z" level=info msg="TearDown network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\" successfully" Jan 29 11:06:40.747638 containerd[1478]: time="2025-01-29T11:06:40.747605497Z" level=info msg="StopPodSandbox for \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\" returns successfully" Jan 29 11:06:40.748181 containerd[1478]: time="2025-01-29T11:06:40.748143617Z" level=info msg="RemovePodSandbox for \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\"" Jan 29 11:06:40.748181 containerd[1478]: time="2025-01-29T11:06:40.748168937Z" level=info msg="Forcibly stopping sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\"" Jan 29 11:06:40.748274 containerd[1478]: time="2025-01-29T11:06:40.748220057Z" level=info msg="TearDown network for sandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\" successfully" Jan 29 11:06:40.751001 containerd[1478]: time="2025-01-29T11:06:40.750919654Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.751001 containerd[1478]: time="2025-01-29T11:06:40.750978214Z" level=info msg="RemovePodSandbox \"20f9f66f874e107518d98d6db6b6a836fcff5a72de58addf94bef33a39fbd273\" returns successfully" Jan 29 11:06:40.751414 containerd[1478]: time="2025-01-29T11:06:40.751384694Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:40.751503 containerd[1478]: time="2025-01-29T11:06:40.751470814Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 11:06:40.751503 containerd[1478]: time="2025-01-29T11:06:40.751486494Z" level=info msg="StopPodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:40.752863 containerd[1478]: time="2025-01-29T11:06:40.752332413Z" level=info msg="RemovePodSandbox for \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:40.752863 containerd[1478]: time="2025-01-29T11:06:40.752365013Z" level=info msg="Forcibly stopping sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\"" Jan 29 11:06:40.752863 containerd[1478]: time="2025-01-29T11:06:40.752433693Z" level=info msg="TearDown network for sandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" successfully" Jan 29 11:06:40.757331 containerd[1478]: time="2025-01-29T11:06:40.757274609Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.757408 containerd[1478]: time="2025-01-29T11:06:40.757347969Z" level=info msg="RemovePodSandbox \"a27077a80fd0273d9d62a9c6cdd69342ca2eba8b207d7b9babfb8928891110c9\" returns successfully" Jan 29 11:06:40.759703 containerd[1478]: time="2025-01-29T11:06:40.758391328Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:40.759703 containerd[1478]: time="2025-01-29T11:06:40.758496568Z" level=info msg="TearDown network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" successfully" Jan 29 11:06:40.759703 containerd[1478]: time="2025-01-29T11:06:40.758506768Z" level=info msg="StopPodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" returns successfully" Jan 29 11:06:40.759703 containerd[1478]: time="2025-01-29T11:06:40.758849767Z" level=info msg="RemovePodSandbox for \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:40.759703 containerd[1478]: time="2025-01-29T11:06:40.758875247Z" level=info msg="Forcibly stopping sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\"" Jan 29 11:06:40.759703 containerd[1478]: time="2025-01-29T11:06:40.758936087Z" level=info msg="TearDown network for sandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" successfully" Jan 29 11:06:40.770068 containerd[1478]: time="2025-01-29T11:06:40.769903918Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.770068 containerd[1478]: time="2025-01-29T11:06:40.770047677Z" level=info msg="RemovePodSandbox \"30dd3991a2bfc12e59b68f4209f95fff72986a8cb1a155a9a625799411058bce\" returns successfully" Jan 29 11:06:40.770853 containerd[1478]: time="2025-01-29T11:06:40.770790437Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" Jan 29 11:06:40.770965 containerd[1478]: time="2025-01-29T11:06:40.770944477Z" level=info msg="TearDown network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" successfully" Jan 29 11:06:40.770965 containerd[1478]: time="2025-01-29T11:06:40.770961237Z" level=info msg="StopPodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" returns successfully" Jan 29 11:06:40.771483 containerd[1478]: time="2025-01-29T11:06:40.771457756Z" level=info msg="RemovePodSandbox for \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" Jan 29 11:06:40.771888 containerd[1478]: time="2025-01-29T11:06:40.771486836Z" level=info msg="Forcibly stopping sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\"" Jan 29 11:06:40.771888 containerd[1478]: time="2025-01-29T11:06:40.771682436Z" level=info msg="TearDown network for sandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" successfully" Jan 29 11:06:40.776766 containerd[1478]: time="2025-01-29T11:06:40.776700432Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.776937 containerd[1478]: time="2025-01-29T11:06:40.776799472Z" level=info msg="RemovePodSandbox \"6ded17baee36e211338948de74ef16e53bc7247199d298703b237193e197d414\" returns successfully" Jan 29 11:06:40.780790 containerd[1478]: time="2025-01-29T11:06:40.780525628Z" level=info msg="StopPodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\"" Jan 29 11:06:40.780790 containerd[1478]: time="2025-01-29T11:06:40.780645188Z" level=info msg="TearDown network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" successfully" Jan 29 11:06:40.780790 containerd[1478]: time="2025-01-29T11:06:40.780656308Z" level=info msg="StopPodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" returns successfully" Jan 29 11:06:40.782449 containerd[1478]: time="2025-01-29T11:06:40.781681467Z" level=info msg="RemovePodSandbox for \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\"" Jan 29 11:06:40.782449 containerd[1478]: time="2025-01-29T11:06:40.781713707Z" level=info msg="Forcibly stopping sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\"" Jan 29 11:06:40.782449 containerd[1478]: time="2025-01-29T11:06:40.781773907Z" level=info msg="TearDown network for sandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" successfully" Jan 29 11:06:40.787722 containerd[1478]: time="2025-01-29T11:06:40.787680982Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.788559 containerd[1478]: time="2025-01-29T11:06:40.787963062Z" level=info msg="RemovePodSandbox \"88129bb58b6f37426da32200b19347f1899e166ce7dabc56a211fb9cfd6c0c63\" returns successfully" Jan 29 11:06:40.788559 containerd[1478]: time="2025-01-29T11:06:40.788233821Z" level=info msg="StopPodSandbox for \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\"" Jan 29 11:06:40.788559 containerd[1478]: time="2025-01-29T11:06:40.788431141Z" level=info msg="TearDown network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\" successfully" Jan 29 11:06:40.788559 containerd[1478]: time="2025-01-29T11:06:40.788444021Z" level=info msg="StopPodSandbox for \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\" returns successfully" Jan 29 11:06:40.790085 containerd[1478]: time="2025-01-29T11:06:40.789574860Z" level=info msg="RemovePodSandbox for \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\"" Jan 29 11:06:40.790319 containerd[1478]: time="2025-01-29T11:06:40.790185340Z" level=info msg="Forcibly stopping sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\"" Jan 29 11:06:40.790319 containerd[1478]: time="2025-01-29T11:06:40.790274700Z" level=info msg="TearDown network for sandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\" successfully" Jan 29 11:06:40.803842 containerd[1478]: time="2025-01-29T11:06:40.803705528Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.803842 containerd[1478]: time="2025-01-29T11:06:40.803781808Z" level=info msg="RemovePodSandbox \"67da082bd9477367e2e642ea8fa69627928471b1a03cdeda5736c6b28bbf8313\" returns successfully" Jan 29 11:06:40.807320 containerd[1478]: time="2025-01-29T11:06:40.807099245Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:40.807320 containerd[1478]: time="2025-01-29T11:06:40.807225925Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:40.807320 containerd[1478]: time="2025-01-29T11:06:40.807236845Z" level=info msg="StopPodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:40.807704 containerd[1478]: time="2025-01-29T11:06:40.807675884Z" level=info msg="RemovePodSandbox for \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:40.807786 containerd[1478]: time="2025-01-29T11:06:40.807707444Z" level=info msg="Forcibly stopping sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\"" Jan 29 11:06:40.808069 containerd[1478]: time="2025-01-29T11:06:40.808039164Z" level=info msg="TearDown network for sandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" successfully" Jan 29 11:06:40.812565 containerd[1478]: time="2025-01-29T11:06:40.812512160Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.812696 containerd[1478]: time="2025-01-29T11:06:40.812591280Z" level=info msg="RemovePodSandbox \"2c9d18a87b2502de3311848fb4c70ec9c86a20537ce59b73750b06ad381bea73\" returns successfully" Jan 29 11:06:40.813242 containerd[1478]: time="2025-01-29T11:06:40.813098160Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:40.813242 containerd[1478]: time="2025-01-29T11:06:40.813187359Z" level=info msg="TearDown network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" successfully" Jan 29 11:06:40.813242 containerd[1478]: time="2025-01-29T11:06:40.813196719Z" level=info msg="StopPodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" returns successfully" Jan 29 11:06:40.814150 containerd[1478]: time="2025-01-29T11:06:40.813724759Z" level=info msg="RemovePodSandbox for \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:40.814150 containerd[1478]: time="2025-01-29T11:06:40.813757759Z" level=info msg="Forcibly stopping sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\"" Jan 29 11:06:40.814150 containerd[1478]: time="2025-01-29T11:06:40.813846679Z" level=info msg="TearDown network for sandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" successfully" Jan 29 11:06:40.818439 containerd[1478]: time="2025-01-29T11:06:40.818366595Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.818603 containerd[1478]: time="2025-01-29T11:06:40.818456675Z" level=info msg="RemovePodSandbox \"e91928337c237c94a8486868ff6701a9ea790079177da7ee0b7b642ce26822e4\" returns successfully" Jan 29 11:06:40.819114 containerd[1478]: time="2025-01-29T11:06:40.819084754Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" Jan 29 11:06:40.819647 containerd[1478]: time="2025-01-29T11:06:40.819384594Z" level=info msg="TearDown network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" successfully" Jan 29 11:06:40.819647 containerd[1478]: time="2025-01-29T11:06:40.819403754Z" level=info msg="StopPodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" returns successfully" Jan 29 11:06:40.819908 containerd[1478]: time="2025-01-29T11:06:40.819776474Z" level=info msg="RemovePodSandbox for \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" Jan 29 11:06:40.819908 containerd[1478]: time="2025-01-29T11:06:40.819847354Z" level=info msg="Forcibly stopping sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\"" Jan 29 11:06:40.820045 containerd[1478]: time="2025-01-29T11:06:40.819947113Z" level=info msg="TearDown network for sandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" successfully" Jan 29 11:06:40.823686 containerd[1478]: time="2025-01-29T11:06:40.823622990Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.824128 containerd[1478]: time="2025-01-29T11:06:40.823704830Z" level=info msg="RemovePodSandbox \"059e56960edd3d480b3f9a25d05c4edbac75d3299118791847d93c6dd89dbd4d\" returns successfully" Jan 29 11:06:40.824441 containerd[1478]: time="2025-01-29T11:06:40.824388870Z" level=info msg="StopPodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\"" Jan 29 11:06:40.824652 containerd[1478]: time="2025-01-29T11:06:40.824488749Z" level=info msg="TearDown network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" successfully" Jan 29 11:06:40.824652 containerd[1478]: time="2025-01-29T11:06:40.824499629Z" level=info msg="StopPodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" returns successfully" Jan 29 11:06:40.825248 containerd[1478]: time="2025-01-29T11:06:40.825193429Z" level=info msg="RemovePodSandbox for \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\"" Jan 29 11:06:40.825248 containerd[1478]: time="2025-01-29T11:06:40.825228149Z" level=info msg="Forcibly stopping sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\"" Jan 29 11:06:40.825372 containerd[1478]: time="2025-01-29T11:06:40.825299989Z" level=info msg="TearDown network for sandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" successfully" Jan 29 11:06:40.828848 containerd[1478]: time="2025-01-29T11:06:40.828716106Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.828848 containerd[1478]: time="2025-01-29T11:06:40.828849706Z" level=info msg="RemovePodSandbox \"fa15d76b0094ccbd4086fbd5da74b8b93addb1919ec90ff738066eb854dac7e8\" returns successfully" Jan 29 11:06:40.829913 containerd[1478]: time="2025-01-29T11:06:40.829635465Z" level=info msg="StopPodSandbox for \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\"" Jan 29 11:06:40.829913 containerd[1478]: time="2025-01-29T11:06:40.829792305Z" level=info msg="TearDown network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\" successfully" Jan 29 11:06:40.829913 containerd[1478]: time="2025-01-29T11:06:40.829809185Z" level=info msg="StopPodSandbox for \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\" returns successfully" Jan 29 11:06:40.830874 containerd[1478]: time="2025-01-29T11:06:40.830627504Z" level=info msg="RemovePodSandbox for \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\"" Jan 29 11:06:40.830874 containerd[1478]: time="2025-01-29T11:06:40.830671464Z" level=info msg="Forcibly stopping sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\"" Jan 29 11:06:40.830874 containerd[1478]: time="2025-01-29T11:06:40.830778624Z" level=info msg="TearDown network for sandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\" successfully" Jan 29 11:06:40.835765 containerd[1478]: time="2025-01-29T11:06:40.835685420Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.835765 containerd[1478]: time="2025-01-29T11:06:40.835767940Z" level=info msg="RemovePodSandbox \"76bdb978b517a695b86b9fc16387a8c4e0b19ba29cc704d1889a394447d1163b\" returns successfully" Jan 29 11:06:40.837020 containerd[1478]: time="2025-01-29T11:06:40.836626379Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:40.837020 containerd[1478]: time="2025-01-29T11:06:40.836839259Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:40.837020 containerd[1478]: time="2025-01-29T11:06:40.836860099Z" level=info msg="StopPodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:40.837281 containerd[1478]: time="2025-01-29T11:06:40.837182938Z" level=info msg="RemovePodSandbox for \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:40.837281 containerd[1478]: time="2025-01-29T11:06:40.837209138Z" level=info msg="Forcibly stopping sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\"" Jan 29 11:06:40.837281 containerd[1478]: time="2025-01-29T11:06:40.837274658Z" level=info msg="TearDown network for sandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" successfully" Jan 29 11:06:40.841208 containerd[1478]: time="2025-01-29T11:06:40.841137575Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.841390 containerd[1478]: time="2025-01-29T11:06:40.841233415Z" level=info msg="RemovePodSandbox \"b59e26947ffe65176a29ed7fcb5c00d9818cb361c7485e13d4a0e07b9910611b\" returns successfully" Jan 29 11:06:40.842144 containerd[1478]: time="2025-01-29T11:06:40.842115894Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:40.842473 containerd[1478]: time="2025-01-29T11:06:40.842350894Z" level=info msg="TearDown network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" successfully" Jan 29 11:06:40.842473 containerd[1478]: time="2025-01-29T11:06:40.842369814Z" level=info msg="StopPodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" returns successfully" Jan 29 11:06:40.844390 containerd[1478]: time="2025-01-29T11:06:40.842863813Z" level=info msg="RemovePodSandbox for \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:40.844390 containerd[1478]: time="2025-01-29T11:06:40.842895813Z" level=info msg="Forcibly stopping sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\"" Jan 29 11:06:40.844390 containerd[1478]: time="2025-01-29T11:06:40.842968093Z" level=info msg="TearDown network for sandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" successfully" Jan 29 11:06:40.847842 containerd[1478]: time="2025-01-29T11:06:40.847778649Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.848035 containerd[1478]: time="2025-01-29T11:06:40.848016209Z" level=info msg="RemovePodSandbox \"7dd20d349fb3f6a7ea19f57d04517b1c25fd06ea5266a05d23159cff334001d1\" returns successfully" Jan 29 11:06:40.848786 containerd[1478]: time="2025-01-29T11:06:40.848745168Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" Jan 29 11:06:40.849362 containerd[1478]: time="2025-01-29T11:06:40.849330968Z" level=info msg="TearDown network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" successfully" Jan 29 11:06:40.849519 containerd[1478]: time="2025-01-29T11:06:40.849491647Z" level=info msg="StopPodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" returns successfully" Jan 29 11:06:40.850310 containerd[1478]: time="2025-01-29T11:06:40.850209167Z" level=info msg="RemovePodSandbox for \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" Jan 29 11:06:40.850310 containerd[1478]: time="2025-01-29T11:06:40.850310047Z" level=info msg="Forcibly stopping sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\"" Jan 29 11:06:40.850477 containerd[1478]: time="2025-01-29T11:06:40.850402287Z" level=info msg="TearDown network for sandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" successfully" Jan 29 11:06:40.856109 containerd[1478]: time="2025-01-29T11:06:40.856034522Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.856280 containerd[1478]: time="2025-01-29T11:06:40.856134162Z" level=info msg="RemovePodSandbox \"09378607ebf0fce9fd89b2a62e040ce6d770a5b8244ed9c7b3661954496cdee9\" returns successfully" Jan 29 11:06:40.856958 containerd[1478]: time="2025-01-29T11:06:40.856716401Z" level=info msg="StopPodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\"" Jan 29 11:06:40.856958 containerd[1478]: time="2025-01-29T11:06:40.856868081Z" level=info msg="TearDown network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" successfully" Jan 29 11:06:40.856958 containerd[1478]: time="2025-01-29T11:06:40.856882641Z" level=info msg="StopPodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" returns successfully" Jan 29 11:06:40.857482 containerd[1478]: time="2025-01-29T11:06:40.857448880Z" level=info msg="RemovePodSandbox for \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\"" Jan 29 11:06:40.857651 containerd[1478]: time="2025-01-29T11:06:40.857488560Z" level=info msg="Forcibly stopping sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\"" Jan 29 11:06:40.857651 containerd[1478]: time="2025-01-29T11:06:40.857582400Z" level=info msg="TearDown network for sandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" successfully" Jan 29 11:06:40.862102 containerd[1478]: time="2025-01-29T11:06:40.862052436Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.862207 containerd[1478]: time="2025-01-29T11:06:40.862132956Z" level=info msg="RemovePodSandbox \"8d8cc9c79d40fe24250f44dceaa68c5039f513c0b16c76e68ff1356e1f7df768\" returns successfully" Jan 29 11:06:40.862669 containerd[1478]: time="2025-01-29T11:06:40.862635476Z" level=info msg="StopPodSandbox for \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\"" Jan 29 11:06:40.862761 containerd[1478]: time="2025-01-29T11:06:40.862743916Z" level=info msg="TearDown network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\" successfully" Jan 29 11:06:40.862836 containerd[1478]: time="2025-01-29T11:06:40.862759556Z" level=info msg="StopPodSandbox for \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\" returns successfully" Jan 29 11:06:40.863432 containerd[1478]: time="2025-01-29T11:06:40.863120315Z" level=info msg="RemovePodSandbox for \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\"" Jan 29 11:06:40.863432 containerd[1478]: time="2025-01-29T11:06:40.863150995Z" level=info msg="Forcibly stopping sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\"" Jan 29 11:06:40.863432 containerd[1478]: time="2025-01-29T11:06:40.863216075Z" level=info msg="TearDown network for sandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\" successfully" Jan 29 11:06:40.868786 containerd[1478]: time="2025-01-29T11:06:40.868696351Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.868786 containerd[1478]: time="2025-01-29T11:06:40.868778990Z" level=info msg="RemovePodSandbox \"384f9ad68c37b496c03db3e4ba0c672052972e974fb04fb507fe4358465804af\" returns successfully" Jan 29 11:06:40.869752 containerd[1478]: time="2025-01-29T11:06:40.869497350Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:40.869752 containerd[1478]: time="2025-01-29T11:06:40.869619350Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:40.869752 containerd[1478]: time="2025-01-29T11:06:40.869650710Z" level=info msg="StopPodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:40.870521 containerd[1478]: time="2025-01-29T11:06:40.870141029Z" level=info msg="RemovePodSandbox for \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:40.870521 containerd[1478]: time="2025-01-29T11:06:40.870173109Z" level=info msg="Forcibly stopping sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\"" Jan 29 11:06:40.870521 containerd[1478]: time="2025-01-29T11:06:40.870241589Z" level=info msg="TearDown network for sandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" successfully" Jan 29 11:06:40.884046 containerd[1478]: time="2025-01-29T11:06:40.883999817Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.884277 containerd[1478]: time="2025-01-29T11:06:40.884257657Z" level=info msg="RemovePodSandbox \"92b573e3ca97a86c25417f8ffb8bfef554cfa9e71be9e73b530876415e092d4a\" returns successfully" Jan 29 11:06:40.885175 containerd[1478]: time="2025-01-29T11:06:40.884985576Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:40.885175 containerd[1478]: time="2025-01-29T11:06:40.885103256Z" level=info msg="TearDown network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" successfully" Jan 29 11:06:40.885175 containerd[1478]: time="2025-01-29T11:06:40.885114656Z" level=info msg="StopPodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" returns successfully" Jan 29 11:06:40.885851 containerd[1478]: time="2025-01-29T11:06:40.885481136Z" level=info msg="RemovePodSandbox for \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:40.885851 containerd[1478]: time="2025-01-29T11:06:40.885505256Z" level=info msg="Forcibly stopping sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\"" Jan 29 11:06:40.885851 containerd[1478]: time="2025-01-29T11:06:40.885613376Z" level=info msg="TearDown network for sandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" successfully" Jan 29 11:06:40.906326 containerd[1478]: time="2025-01-29T11:06:40.906264597Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.906656 containerd[1478]: time="2025-01-29T11:06:40.906620157Z" level=info msg="RemovePodSandbox \"ba32e467d993f9b2047c763625b3b0a00d0a102f3a165900bb7030431c0449a1\" returns successfully" Jan 29 11:06:40.909521 containerd[1478]: time="2025-01-29T11:06:40.909066595Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" Jan 29 11:06:40.909521 containerd[1478]: time="2025-01-29T11:06:40.909200195Z" level=info msg="TearDown network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" successfully" Jan 29 11:06:40.909521 containerd[1478]: time="2025-01-29T11:06:40.909212035Z" level=info msg="StopPodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" returns successfully" Jan 29 11:06:40.909835 containerd[1478]: time="2025-01-29T11:06:40.909592554Z" level=info msg="RemovePodSandbox for \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" Jan 29 11:06:40.909835 containerd[1478]: time="2025-01-29T11:06:40.909618154Z" level=info msg="Forcibly stopping sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\"" Jan 29 11:06:40.909835 containerd[1478]: time="2025-01-29T11:06:40.909678634Z" level=info msg="TearDown network for sandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" successfully" Jan 29 11:06:40.913272 containerd[1478]: time="2025-01-29T11:06:40.913206711Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.913398 containerd[1478]: time="2025-01-29T11:06:40.913321711Z" level=info msg="RemovePodSandbox \"8a526f6607c75ca37900f7ba549a5e15a50585de978afc10f55025fffb2d558c\" returns successfully" Jan 29 11:06:40.914480 containerd[1478]: time="2025-01-29T11:06:40.914034511Z" level=info msg="StopPodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\"" Jan 29 11:06:40.914480 containerd[1478]: time="2025-01-29T11:06:40.914161390Z" level=info msg="TearDown network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" successfully" Jan 29 11:06:40.914480 containerd[1478]: time="2025-01-29T11:06:40.914173670Z" level=info msg="StopPodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" returns successfully" Jan 29 11:06:40.914662 containerd[1478]: time="2025-01-29T11:06:40.914571510Z" level=info msg="RemovePodSandbox for \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\"" Jan 29 11:06:40.914662 containerd[1478]: time="2025-01-29T11:06:40.914601230Z" level=info msg="Forcibly stopping sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\"" Jan 29 11:06:40.914709 containerd[1478]: time="2025-01-29T11:06:40.914667750Z" level=info msg="TearDown network for sandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" successfully" Jan 29 11:06:40.917969 containerd[1478]: time="2025-01-29T11:06:40.917911747Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 11:06:40.918192 containerd[1478]: time="2025-01-29T11:06:40.917995627Z" level=info msg="RemovePodSandbox \"1dfc657ab37008185f9fe3daf5384533e566290be5df410c8662c7d6bba58c0b\" returns successfully" Jan 29 11:06:40.919107 containerd[1478]: time="2025-01-29T11:06:40.918905666Z" level=info msg="StopPodSandbox for \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\"" Jan 29 11:06:40.919298 containerd[1478]: time="2025-01-29T11:06:40.919231186Z" level=info msg="TearDown network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\" successfully" Jan 29 11:06:40.919298 containerd[1478]: time="2025-01-29T11:06:40.919253706Z" level=info msg="StopPodSandbox for \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\" returns successfully" Jan 29 11:06:40.920482 containerd[1478]: time="2025-01-29T11:06:40.920433145Z" level=info msg="RemovePodSandbox for \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\"" Jan 29 11:06:40.920609 containerd[1478]: time="2025-01-29T11:06:40.920491945Z" level=info msg="Forcibly stopping sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\"" Jan 29 11:06:40.920687 containerd[1478]: time="2025-01-29T11:06:40.920653145Z" level=info msg="TearDown network for sandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\" successfully" Jan 29 11:06:40.925159 containerd[1478]: time="2025-01-29T11:06:40.925110221Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 11:06:40.925355 containerd[1478]: time="2025-01-29T11:06:40.925195821Z" level=info msg="RemovePodSandbox \"58c5088b73346197daae4fa2b910d15d4a80d6728b82eb8fcee7e273c87bdcc3\" returns successfully" Jan 29 11:07:22.372848 systemd[1]: run-containerd-runc-k8s.io-9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154-runc.YZUIxC.mount: Deactivated successfully. Jan 29 11:07:40.760112 systemd[1]: run-containerd-runc-k8s.io-717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111-runc.9D2YMt.mount: Deactivated successfully. Jan 29 11:08:08.726294 systemd[1]: Started sshd@8-78.46.186.225:22-92.255.85.189:15542.service - OpenSSH per-connection server daemon (92.255.85.189:15542). Jan 29 11:08:09.532282 sshd[5997]: Invalid user ubnt from 92.255.85.189 port 15542 Jan 29 11:08:09.613789 sshd[5997]: Connection closed by invalid user ubnt 92.255.85.189 port 15542 [preauth] Jan 29 11:08:09.617488 systemd[1]: sshd@8-78.46.186.225:22-92.255.85.189:15542.service: Deactivated successfully. Jan 29 11:09:01.882835 systemd[1]: run-containerd-runc-k8s.io-717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111-runc.o6QuQq.mount: Deactivated successfully. Jan 29 11:10:32.376085 systemd[1]: Started sshd@9-78.46.186.225:22-147.75.109.163:49228.service - OpenSSH per-connection server daemon (147.75.109.163:49228). Jan 29 11:10:33.372630 sshd[6293]: Accepted publickey for core from 147.75.109.163 port 49228 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:10:33.374953 sshd-session[6293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:10:33.381157 systemd-logind[1458]: New session 8 of user core. Jan 29 11:10:33.390031 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 29 11:10:34.143499 sshd[6295]: Connection closed by 147.75.109.163 port 49228 Jan 29 11:10:34.144369 sshd-session[6293]: pam_unix(sshd:session): session closed for user core Jan 29 11:10:34.149345 systemd-logind[1458]: Session 8 logged out. Waiting for processes to exit. Jan 29 11:10:34.149607 systemd[1]: sshd@9-78.46.186.225:22-147.75.109.163:49228.service: Deactivated successfully. Jan 29 11:10:34.151947 systemd[1]: session-8.scope: Deactivated successfully. Jan 29 11:10:34.154544 systemd-logind[1458]: Removed session 8. Jan 29 11:10:39.320599 systemd[1]: Started sshd@10-78.46.186.225:22-147.75.109.163:35346.service - OpenSSH per-connection server daemon (147.75.109.163:35346). Jan 29 11:10:40.308916 sshd[6308]: Accepted publickey for core from 147.75.109.163 port 35346 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:10:40.311009 sshd-session[6308]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:10:40.317186 systemd-logind[1458]: New session 9 of user core. Jan 29 11:10:40.323144 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 29 11:10:41.066744 sshd[6310]: Connection closed by 147.75.109.163 port 35346 Jan 29 11:10:41.067616 sshd-session[6308]: pam_unix(sshd:session): session closed for user core Jan 29 11:10:41.071224 systemd[1]: sshd@10-78.46.186.225:22-147.75.109.163:35346.service: Deactivated successfully. Jan 29 11:10:41.073764 systemd[1]: session-9.scope: Deactivated successfully. Jan 29 11:10:41.075382 systemd-logind[1458]: Session 9 logged out. Waiting for processes to exit. Jan 29 11:10:41.076794 systemd-logind[1458]: Removed session 9. 
Jan 29 11:10:46.247750 systemd[1]: Started sshd@11-78.46.186.225:22-147.75.109.163:35360.service - OpenSSH per-connection server daemon (147.75.109.163:35360). Jan 29 11:10:47.254252 sshd[6342]: Accepted publickey for core from 147.75.109.163 port 35360 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:10:47.256665 sshd-session[6342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:10:47.261962 systemd-logind[1458]: New session 10 of user core. Jan 29 11:10:47.269310 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 29 11:10:48.014626 sshd[6344]: Connection closed by 147.75.109.163 port 35360 Jan 29 11:10:48.015348 sshd-session[6342]: pam_unix(sshd:session): session closed for user core Jan 29 11:10:48.021066 systemd[1]: sshd@11-78.46.186.225:22-147.75.109.163:35360.service: Deactivated successfully. Jan 29 11:10:48.023997 systemd[1]: session-10.scope: Deactivated successfully. Jan 29 11:10:48.024978 systemd-logind[1458]: Session 10 logged out. Waiting for processes to exit. Jan 29 11:10:48.026802 systemd-logind[1458]: Removed session 10. Jan 29 11:10:53.195208 systemd[1]: Started sshd@12-78.46.186.225:22-147.75.109.163:35668.service - OpenSSH per-connection server daemon (147.75.109.163:35668). Jan 29 11:10:54.183905 sshd[6378]: Accepted publickey for core from 147.75.109.163 port 35668 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:10:54.186458 sshd-session[6378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:10:54.191896 systemd-logind[1458]: New session 11 of user core. Jan 29 11:10:54.198133 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 29 11:10:54.945966 sshd[6380]: Connection closed by 147.75.109.163 port 35668 Jan 29 11:10:54.946383 sshd-session[6378]: pam_unix(sshd:session): session closed for user core Jan 29 11:10:54.951881 systemd[1]: sshd@12-78.46.186.225:22-147.75.109.163:35668.service: Deactivated successfully. Jan 29 11:10:54.955299 systemd[1]: session-11.scope: Deactivated successfully. Jan 29 11:10:54.956786 systemd-logind[1458]: Session 11 logged out. Waiting for processes to exit. Jan 29 11:10:54.958425 systemd-logind[1458]: Removed session 11. Jan 29 11:11:00.124256 systemd[1]: Started sshd@13-78.46.186.225:22-147.75.109.163:42662.service - OpenSSH per-connection server daemon (147.75.109.163:42662). Jan 29 11:11:01.118931 sshd[6393]: Accepted publickey for core from 147.75.109.163 port 42662 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:01.122004 sshd-session[6393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:01.132639 systemd-logind[1458]: New session 12 of user core. Jan 29 11:11:01.135176 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 29 11:11:01.880788 sshd[6395]: Connection closed by 147.75.109.163 port 42662 Jan 29 11:11:01.881450 sshd-session[6393]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:01.885883 systemd[1]: sshd@13-78.46.186.225:22-147.75.109.163:42662.service: Deactivated successfully. Jan 29 11:11:01.887926 systemd[1]: session-12.scope: Deactivated successfully. Jan 29 11:11:01.891273 systemd-logind[1458]: Session 12 logged out. Waiting for processes to exit. Jan 29 11:11:01.893076 systemd-logind[1458]: Removed session 12. 
Jan 29 11:11:07.056417 systemd[1]: Started sshd@14-78.46.186.225:22-147.75.109.163:42678.service - OpenSSH per-connection server daemon (147.75.109.163:42678). Jan 29 11:11:08.041861 sshd[6431]: Accepted publickey for core from 147.75.109.163 port 42678 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:08.043881 sshd-session[6431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:08.048613 systemd-logind[1458]: New session 13 of user core. Jan 29 11:11:08.053036 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 29 11:11:08.801079 sshd[6433]: Connection closed by 147.75.109.163 port 42678 Jan 29 11:11:08.802053 sshd-session[6431]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:08.806724 systemd[1]: sshd@14-78.46.186.225:22-147.75.109.163:42678.service: Deactivated successfully. Jan 29 11:11:08.810954 systemd[1]: session-13.scope: Deactivated successfully. Jan 29 11:11:08.813333 systemd-logind[1458]: Session 13 logged out. Waiting for processes to exit. Jan 29 11:11:08.814516 systemd-logind[1458]: Removed session 13. Jan 29 11:11:10.757544 systemd[1]: run-containerd-runc-k8s.io-717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111-runc.9qt4st.mount: Deactivated successfully. Jan 29 11:11:13.973255 systemd[1]: Started sshd@15-78.46.186.225:22-147.75.109.163:57642.service - OpenSSH per-connection server daemon (147.75.109.163:57642). Jan 29 11:11:14.942492 sshd[6466]: Accepted publickey for core from 147.75.109.163 port 57642 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:14.944785 sshd-session[6466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:14.950843 systemd-logind[1458]: New session 14 of user core. Jan 29 11:11:14.959289 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 29 11:11:15.693527 sshd[6480]: Connection closed by 147.75.109.163 port 57642 Jan 29 11:11:15.696345 sshd-session[6466]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:15.700440 systemd[1]: sshd@15-78.46.186.225:22-147.75.109.163:57642.service: Deactivated successfully. Jan 29 11:11:15.705880 systemd[1]: session-14.scope: Deactivated successfully. Jan 29 11:11:15.706783 systemd-logind[1458]: Session 14 logged out. Waiting for processes to exit. Jan 29 11:11:15.707974 systemd-logind[1458]: Removed session 14. Jan 29 11:11:20.872276 systemd[1]: Started sshd@16-78.46.186.225:22-147.75.109.163:49368.service - OpenSSH per-connection server daemon (147.75.109.163:49368). Jan 29 11:11:21.852978 sshd[6493]: Accepted publickey for core from 147.75.109.163 port 49368 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:21.855032 sshd-session[6493]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:21.861059 systemd-logind[1458]: New session 15 of user core. Jan 29 11:11:21.867020 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 29 11:11:22.382135 systemd[1]: run-containerd-runc-k8s.io-9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154-runc.FFFNQ5.mount: Deactivated successfully. Jan 29 11:11:22.621325 sshd[6495]: Connection closed by 147.75.109.163 port 49368 Jan 29 11:11:22.622301 sshd-session[6493]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:22.626244 systemd[1]: sshd@16-78.46.186.225:22-147.75.109.163:49368.service: Deactivated successfully. 
Jan 29 11:11:22.628456 systemd[1]: session-15.scope: Deactivated successfully. Jan 29 11:11:22.630198 systemd-logind[1458]: Session 15 logged out. Waiting for processes to exit. Jan 29 11:11:22.631572 systemd-logind[1458]: Removed session 15. Jan 29 11:11:27.803137 systemd[1]: Started sshd@17-78.46.186.225:22-147.75.109.163:52788.service - OpenSSH per-connection server daemon (147.75.109.163:52788). Jan 29 11:11:28.798630 sshd[6530]: Accepted publickey for core from 147.75.109.163 port 52788 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:28.800805 sshd-session[6530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:28.805979 systemd-logind[1458]: New session 16 of user core. Jan 29 11:11:28.812290 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 29 11:11:29.555498 sshd[6532]: Connection closed by 147.75.109.163 port 52788 Jan 29 11:11:29.556615 sshd-session[6530]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:29.563793 systemd[1]: sshd@17-78.46.186.225:22-147.75.109.163:52788.service: Deactivated successfully. Jan 29 11:11:29.563839 systemd-logind[1458]: Session 16 logged out. Waiting for processes to exit. Jan 29 11:11:29.567250 systemd[1]: session-16.scope: Deactivated successfully. Jan 29 11:11:29.568754 systemd-logind[1458]: Removed session 16. Jan 29 11:11:34.733220 systemd[1]: Started sshd@18-78.46.186.225:22-147.75.109.163:52798.service - OpenSSH per-connection server daemon (147.75.109.163:52798). Jan 29 11:11:35.715620 sshd[6544]: Accepted publickey for core from 147.75.109.163 port 52798 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:35.718092 sshd-session[6544]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:35.724251 systemd-logind[1458]: New session 17 of user core. Jan 29 11:11:35.730059 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 29 11:11:36.485902 sshd[6546]: Connection closed by 147.75.109.163 port 52798 Jan 29 11:11:36.486751 sshd-session[6544]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:36.491166 systemd[1]: sshd@18-78.46.186.225:22-147.75.109.163:52798.service: Deactivated successfully. Jan 29 11:11:36.494541 systemd[1]: session-17.scope: Deactivated successfully. Jan 29 11:11:36.496679 systemd-logind[1458]: Session 17 logged out. Waiting for processes to exit. Jan 29 11:11:36.497738 systemd-logind[1458]: Removed session 17. Jan 29 11:11:41.658163 systemd[1]: Started sshd@19-78.46.186.225:22-147.75.109.163:42146.service - OpenSSH per-connection server daemon (147.75.109.163:42146). Jan 29 11:11:42.634677 sshd[6580]: Accepted publickey for core from 147.75.109.163 port 42146 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:42.636685 sshd-session[6580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:42.642237 systemd-logind[1458]: New session 18 of user core. Jan 29 11:11:42.648202 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 29 11:11:43.383711 sshd[6582]: Connection closed by 147.75.109.163 port 42146 Jan 29 11:11:43.384574 sshd-session[6580]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:43.389643 systemd[1]: sshd@19-78.46.186.225:22-147.75.109.163:42146.service: Deactivated successfully. Jan 29 11:11:43.395301 systemd[1]: session-18.scope: Deactivated successfully. Jan 29 11:11:43.396392 systemd-logind[1458]: Session 18 logged out. Waiting for processes to exit. Jan 29 11:11:43.398851 systemd-logind[1458]: Removed session 18. 
Jan 29 11:11:48.563587 systemd[1]: Started sshd@20-78.46.186.225:22-147.75.109.163:60592.service - OpenSSH per-connection server daemon (147.75.109.163:60592). Jan 29 11:11:49.547747 sshd[6602]: Accepted publickey for core from 147.75.109.163 port 60592 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:49.550158 sshd-session[6602]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:49.555564 systemd-logind[1458]: New session 19 of user core. Jan 29 11:11:49.560969 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 29 11:11:50.317706 sshd[6604]: Connection closed by 147.75.109.163 port 60592 Jan 29 11:11:50.319412 sshd-session[6602]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:50.324510 systemd-logind[1458]: Session 19 logged out. Waiting for processes to exit. Jan 29 11:11:50.325176 systemd[1]: sshd@20-78.46.186.225:22-147.75.109.163:60592.service: Deactivated successfully. Jan 29 11:11:50.327690 systemd[1]: session-19.scope: Deactivated successfully. Jan 29 11:11:50.330281 systemd-logind[1458]: Removed session 19. Jan 29 11:11:55.494090 systemd[1]: Started sshd@21-78.46.186.225:22-147.75.109.163:60594.service - OpenSSH per-connection server daemon (147.75.109.163:60594). Jan 29 11:11:56.470551 sshd[6639]: Accepted publickey for core from 147.75.109.163 port 60594 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:11:56.472998 sshd-session[6639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:11:56.479155 systemd-logind[1458]: New session 20 of user core. Jan 29 11:11:56.484031 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 29 11:11:57.220419 sshd[6641]: Connection closed by 147.75.109.163 port 60594 Jan 29 11:11:57.221401 sshd-session[6639]: pam_unix(sshd:session): session closed for user core Jan 29 11:11:57.226882 systemd-logind[1458]: Session 20 logged out. Waiting for processes to exit. Jan 29 11:11:57.227806 systemd[1]: sshd@21-78.46.186.225:22-147.75.109.163:60594.service: Deactivated successfully. Jan 29 11:11:57.230206 systemd[1]: session-20.scope: Deactivated successfully. Jan 29 11:11:57.232202 systemd-logind[1458]: Removed session 20. Jan 29 11:12:02.394073 systemd[1]: Started sshd@22-78.46.186.225:22-147.75.109.163:34708.service - OpenSSH per-connection server daemon (147.75.109.163:34708). Jan 29 11:12:03.389874 sshd[6675]: Accepted publickey for core from 147.75.109.163 port 34708 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:03.392091 sshd-session[6675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:03.398122 systemd-logind[1458]: New session 21 of user core. Jan 29 11:12:03.401007 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 29 11:12:04.154888 sshd[6677]: Connection closed by 147.75.109.163 port 34708 Jan 29 11:12:04.155733 sshd-session[6675]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:04.161729 systemd[1]: sshd@22-78.46.186.225:22-147.75.109.163:34708.service: Deactivated successfully. Jan 29 11:12:04.165263 systemd[1]: session-21.scope: Deactivated successfully. Jan 29 11:12:04.166016 systemd-logind[1458]: Session 21 logged out. Waiting for processes to exit. Jan 29 11:12:04.167029 systemd-logind[1458]: Removed session 21. 
Jan 29 11:12:09.328563 systemd[1]: Started sshd@23-78.46.186.225:22-147.75.109.163:60672.service - OpenSSH per-connection server daemon (147.75.109.163:60672). Jan 29 11:12:10.323742 sshd[6689]: Accepted publickey for core from 147.75.109.163 port 60672 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:10.326320 sshd-session[6689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:10.335903 systemd-logind[1458]: New session 22 of user core. Jan 29 11:12:10.342138 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 29 11:12:11.075915 sshd[6691]: Connection closed by 147.75.109.163 port 60672 Jan 29 11:12:11.076938 sshd-session[6689]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:11.081512 systemd[1]: sshd@23-78.46.186.225:22-147.75.109.163:60672.service: Deactivated successfully. Jan 29 11:12:11.084255 systemd[1]: session-22.scope: Deactivated successfully. Jan 29 11:12:11.087451 systemd-logind[1458]: Session 22 logged out. Waiting for processes to exit. Jan 29 11:12:11.089308 systemd-logind[1458]: Removed session 22. Jan 29 11:12:16.249054 systemd[1]: Started sshd@24-78.46.186.225:22-147.75.109.163:60688.service - OpenSSH per-connection server daemon (147.75.109.163:60688). Jan 29 11:12:17.230154 sshd[6722]: Accepted publickey for core from 147.75.109.163 port 60688 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:17.232999 sshd-session[6722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:17.237983 systemd-logind[1458]: New session 23 of user core. Jan 29 11:12:17.243969 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 29 11:12:17.983117 sshd[6724]: Connection closed by 147.75.109.163 port 60688 Jan 29 11:12:17.983636 sshd-session[6722]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:17.988032 systemd[1]: sshd@24-78.46.186.225:22-147.75.109.163:60688.service: Deactivated successfully. Jan 29 11:12:17.990538 systemd[1]: session-23.scope: Deactivated successfully. Jan 29 11:12:17.991461 systemd-logind[1458]: Session 23 logged out. Waiting for processes to exit. Jan 29 11:12:17.992398 systemd-logind[1458]: Removed session 23. Jan 29 11:12:23.159282 systemd[1]: Started sshd@25-78.46.186.225:22-147.75.109.163:34382.service - OpenSSH per-connection server daemon (147.75.109.163:34382). Jan 29 11:12:24.133705 sshd[6757]: Accepted publickey for core from 147.75.109.163 port 34382 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:24.135609 sshd-session[6757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:24.142836 systemd-logind[1458]: New session 24 of user core. Jan 29 11:12:24.153410 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 29 11:12:24.884720 sshd[6759]: Connection closed by 147.75.109.163 port 34382 Jan 29 11:12:24.885362 sshd-session[6757]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:24.891478 systemd-logind[1458]: Session 24 logged out. Waiting for processes to exit. Jan 29 11:12:24.892245 systemd[1]: sshd@25-78.46.186.225:22-147.75.109.163:34382.service: Deactivated successfully. Jan 29 11:12:24.894715 systemd[1]: session-24.scope: Deactivated successfully. Jan 29 11:12:24.895714 systemd-logind[1458]: Removed session 24. 
Jan 29 11:12:30.064229 systemd[1]: Started sshd@26-78.46.186.225:22-147.75.109.163:59686.service - OpenSSH per-connection server daemon (147.75.109.163:59686). Jan 29 11:12:31.059928 sshd[6773]: Accepted publickey for core from 147.75.109.163 port 59686 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:31.061953 sshd-session[6773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:31.066647 systemd-logind[1458]: New session 25 of user core. Jan 29 11:12:31.071969 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 29 11:12:31.825783 sshd[6775]: Connection closed by 147.75.109.163 port 59686 Jan 29 11:12:31.826397 sshd-session[6773]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:31.832273 systemd[1]: sshd@26-78.46.186.225:22-147.75.109.163:59686.service: Deactivated successfully. Jan 29 11:12:31.836522 systemd[1]: session-25.scope: Deactivated successfully. Jan 29 11:12:31.837576 systemd-logind[1458]: Session 25 logged out. Waiting for processes to exit. Jan 29 11:12:31.838488 systemd-logind[1458]: Removed session 25. Jan 29 11:12:36.999292 systemd[1]: Started sshd@27-78.46.186.225:22-147.75.109.163:59690.service - OpenSSH per-connection server daemon (147.75.109.163:59690). Jan 29 11:12:37.983898 sshd[6792]: Accepted publickey for core from 147.75.109.163 port 59690 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:37.986333 sshd-session[6792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:37.990891 systemd-logind[1458]: New session 26 of user core. Jan 29 11:12:38.001131 systemd[1]: Started session-26.scope - Session 26 of User core. Jan 29 11:12:38.735722 sshd[6794]: Connection closed by 147.75.109.163 port 59690 Jan 29 11:12:38.735554 sshd-session[6792]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:38.740760 systemd[1]: sshd@27-78.46.186.225:22-147.75.109.163:59690.service: Deactivated successfully. Jan 29 11:12:38.743113 systemd[1]: session-26.scope: Deactivated successfully. Jan 29 11:12:38.746081 systemd-logind[1458]: Session 26 logged out. Waiting for processes to exit. Jan 29 11:12:38.748015 systemd-logind[1458]: Removed session 26. Jan 29 11:12:43.915182 systemd[1]: Started sshd@28-78.46.186.225:22-147.75.109.163:60726.service - OpenSSH per-connection server daemon (147.75.109.163:60726). Jan 29 11:12:44.909452 sshd[6827]: Accepted publickey for core from 147.75.109.163 port 60726 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:44.911433 sshd-session[6827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:44.916605 systemd-logind[1458]: New session 27 of user core. Jan 29 11:12:44.923135 systemd[1]: Started session-27.scope - Session 27 of User core. Jan 29 11:12:45.670361 sshd[6829]: Connection closed by 147.75.109.163 port 60726 Jan 29 11:12:45.671259 sshd-session[6827]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:45.675587 systemd[1]: sshd@28-78.46.186.225:22-147.75.109.163:60726.service: Deactivated successfully. Jan 29 11:12:45.679698 systemd[1]: session-27.scope: Deactivated successfully. Jan 29 11:12:45.683040 systemd-logind[1458]: Session 27 logged out. Waiting for processes to exit. Jan 29 11:12:45.684475 systemd-logind[1458]: Removed session 27. 
Jan 29 11:12:50.852105 systemd[1]: Started sshd@29-78.46.186.225:22-147.75.109.163:33944.service - OpenSSH per-connection server daemon (147.75.109.163:33944). Jan 29 11:12:51.835725 sshd[6842]: Accepted publickey for core from 147.75.109.163 port 33944 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:51.837968 sshd-session[6842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:51.843418 systemd-logind[1458]: New session 28 of user core. Jan 29 11:12:51.850109 systemd[1]: Started session-28.scope - Session 28 of User core. Jan 29 11:12:52.615910 sshd[6856]: Connection closed by 147.75.109.163 port 33944 Jan 29 11:12:52.615731 sshd-session[6842]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:52.620521 systemd-logind[1458]: Session 28 logged out. Waiting for processes to exit. Jan 29 11:12:52.621210 systemd[1]: sshd@29-78.46.186.225:22-147.75.109.163:33944.service: Deactivated successfully. Jan 29 11:12:52.623761 systemd[1]: session-28.scope: Deactivated successfully. Jan 29 11:12:52.626474 systemd-logind[1458]: Removed session 28. Jan 29 11:12:57.789092 systemd[1]: Started sshd@30-78.46.186.225:22-147.75.109.163:35532.service - OpenSSH per-connection server daemon (147.75.109.163:35532). Jan 29 11:12:58.801843 sshd[6894]: Accepted publickey for core from 147.75.109.163 port 35532 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:12:58.804756 sshd-session[6894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:12:58.810916 systemd-logind[1458]: New session 29 of user core. Jan 29 11:12:58.817143 systemd[1]: Started session-29.scope - Session 29 of User core. Jan 29 11:12:59.564100 sshd[6896]: Connection closed by 147.75.109.163 port 35532 Jan 29 11:12:59.564631 sshd-session[6894]: pam_unix(sshd:session): session closed for user core Jan 29 11:12:59.568806 systemd-logind[1458]: Session 29 logged out. Waiting for processes to exit. Jan 29 11:12:59.570155 systemd[1]: sshd@30-78.46.186.225:22-147.75.109.163:35532.service: Deactivated successfully. Jan 29 11:12:59.573656 systemd[1]: session-29.scope: Deactivated successfully. Jan 29 11:12:59.574924 systemd-logind[1458]: Removed session 29. Jan 29 11:13:04.745233 systemd[1]: Started sshd@31-78.46.186.225:22-147.75.109.163:35538.service - OpenSSH per-connection server daemon (147.75.109.163:35538). Jan 29 11:13:05.717143 sshd[6925]: Accepted publickey for core from 147.75.109.163 port 35538 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:05.719092 sshd-session[6925]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:05.725490 systemd-logind[1458]: New session 30 of user core. Jan 29 11:13:05.732199 systemd[1]: Started session-30.scope - Session 30 of User core. Jan 29 11:13:06.466424 sshd[6927]: Connection closed by 147.75.109.163 port 35538 Jan 29 11:13:06.466250 sshd-session[6925]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:06.470992 systemd[1]: session-30.scope: Deactivated successfully. Jan 29 11:13:06.473329 systemd[1]: sshd@31-78.46.186.225:22-147.75.109.163:35538.service: Deactivated successfully. Jan 29 11:13:06.479047 systemd-logind[1458]: Session 30 logged out. Waiting for processes to exit. Jan 29 11:13:06.480437 systemd-logind[1458]: Removed session 30. 
Jan 29 11:13:11.644152 systemd[1]: Started sshd@32-78.46.186.225:22-147.75.109.163:58276.service - OpenSSH per-connection server daemon (147.75.109.163:58276). Jan 29 11:13:12.637574 sshd[6959]: Accepted publickey for core from 147.75.109.163 port 58276 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:12.639214 sshd-session[6959]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:12.644868 systemd-logind[1458]: New session 31 of user core. Jan 29 11:13:12.652131 systemd[1]: Started session-31.scope - Session 31 of User core. Jan 29 11:13:13.394849 sshd[6963]: Connection closed by 147.75.109.163 port 58276 Jan 29 11:13:13.395786 sshd-session[6959]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:13.400238 systemd[1]: sshd@32-78.46.186.225:22-147.75.109.163:58276.service: Deactivated successfully. Jan 29 11:13:13.402610 systemd[1]: session-31.scope: Deactivated successfully. Jan 29 11:13:13.405613 systemd-logind[1458]: Session 31 logged out. Waiting for processes to exit. Jan 29 11:13:13.406639 systemd-logind[1458]: Removed session 31. Jan 29 11:13:18.572091 systemd[1]: Started sshd@33-78.46.186.225:22-147.75.109.163:57026.service - OpenSSH per-connection server daemon (147.75.109.163:57026). Jan 29 11:13:19.567336 sshd[6974]: Accepted publickey for core from 147.75.109.163 port 57026 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:19.568348 sshd-session[6974]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:19.575117 systemd-logind[1458]: New session 32 of user core. Jan 29 11:13:19.580002 systemd[1]: Started session-32.scope - Session 32 of User core. Jan 29 11:13:20.322186 sshd[6976]: Connection closed by 147.75.109.163 port 57026 Jan 29 11:13:20.322069 sshd-session[6974]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:20.326579 systemd[1]: sshd@33-78.46.186.225:22-147.75.109.163:57026.service: Deactivated successfully. Jan 29 11:13:20.329265 systemd[1]: session-32.scope: Deactivated successfully. Jan 29 11:13:20.330887 systemd-logind[1458]: Session 32 logged out. Waiting for processes to exit. Jan 29 11:13:20.332554 systemd-logind[1458]: Removed session 32. Jan 29 11:13:25.497127 systemd[1]: Started sshd@34-78.46.186.225:22-147.75.109.163:57042.service - OpenSSH per-connection server daemon (147.75.109.163:57042). Jan 29 11:13:26.491874 sshd[7012]: Accepted publickey for core from 147.75.109.163 port 57042 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:26.494020 sshd-session[7012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:26.500186 systemd-logind[1458]: New session 33 of user core. Jan 29 11:13:26.504990 systemd[1]: Started session-33.scope - Session 33 of User core. Jan 29 11:13:27.257697 sshd[7014]: Connection closed by 147.75.109.163 port 57042 Jan 29 11:13:27.257572 sshd-session[7012]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:27.262668 systemd[1]: sshd@34-78.46.186.225:22-147.75.109.163:57042.service: Deactivated successfully. Jan 29 11:13:27.265860 systemd[1]: session-33.scope: Deactivated successfully. Jan 29 11:13:27.267020 systemd-logind[1458]: Session 33 logged out. Waiting for processes to exit. Jan 29 11:13:27.268492 systemd-logind[1458]: Removed session 33. 
Jan 29 11:13:28.084183 systemd[1]: Started sshd@35-78.46.186.225:22-195.178.110.65:47240.service - OpenSSH per-connection server daemon (195.178.110.65:47240). Jan 29 11:13:28.158548 sshd[7026]: Invalid user ubuntu from 195.178.110.65 port 47240 Jan 29 11:13:28.170537 sshd[7026]: Connection closed by invalid user ubuntu 195.178.110.65 port 47240 [preauth] Jan 29 11:13:28.173319 systemd[1]: sshd@35-78.46.186.225:22-195.178.110.65:47240.service: Deactivated successfully. Jan 29 11:13:32.435202 systemd[1]: Started sshd@36-78.46.186.225:22-147.75.109.163:57588.service - OpenSSH per-connection server daemon (147.75.109.163:57588). Jan 29 11:13:33.419221 sshd[7031]: Accepted publickey for core from 147.75.109.163 port 57588 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:33.421475 sshd-session[7031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:33.428860 systemd-logind[1458]: New session 34 of user core. Jan 29 11:13:33.432262 systemd[1]: Started session-34.scope - Session 34 of User core. Jan 29 11:13:34.171316 sshd[7033]: Connection closed by 147.75.109.163 port 57588 Jan 29 11:13:34.172363 sshd-session[7031]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:34.178012 systemd[1]: sshd@36-78.46.186.225:22-147.75.109.163:57588.service: Deactivated successfully. Jan 29 11:13:34.178327 systemd-logind[1458]: Session 34 logged out. Waiting for processes to exit. Jan 29 11:13:34.180528 systemd[1]: session-34.scope: Deactivated successfully. Jan 29 11:13:34.182292 systemd-logind[1458]: Removed session 34. Jan 29 11:13:39.351308 systemd[1]: Started sshd@37-78.46.186.225:22-147.75.109.163:36342.service - OpenSSH per-connection server daemon (147.75.109.163:36342). Jan 29 11:13:40.324345 sshd[7044]: Accepted publickey for core from 147.75.109.163 port 36342 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:40.326297 sshd-session[7044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:40.333510 systemd-logind[1458]: New session 35 of user core. Jan 29 11:13:40.339015 systemd[1]: Started session-35.scope - Session 35 of User core. Jan 29 11:13:41.079203 sshd[7047]: Connection closed by 147.75.109.163 port 36342 Jan 29 11:13:41.080237 sshd-session[7044]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:41.084645 systemd[1]: sshd@37-78.46.186.225:22-147.75.109.163:36342.service: Deactivated successfully. Jan 29 11:13:41.087699 systemd[1]: session-35.scope: Deactivated successfully. Jan 29 11:13:41.089974 systemd-logind[1458]: Session 35 logged out. Waiting for processes to exit. Jan 29 11:13:41.091451 systemd-logind[1458]: Removed session 35. Jan 29 11:13:46.260191 systemd[1]: Started sshd@38-78.46.186.225:22-147.75.109.163:36354.service - OpenSSH per-connection server daemon (147.75.109.163:36354). Jan 29 11:13:47.246206 sshd[7079]: Accepted publickey for core from 147.75.109.163 port 36354 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:47.248720 sshd-session[7079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:47.256593 systemd-logind[1458]: New session 36 of user core. Jan 29 11:13:47.258976 systemd[1]: Started session-36.scope - Session 36 of User core. 
Jan 29 11:13:48.005439 sshd[7081]: Connection closed by 147.75.109.163 port 36354 Jan 29 11:13:48.006687 sshd-session[7079]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:48.011109 systemd[1]: sshd@38-78.46.186.225:22-147.75.109.163:36354.service: Deactivated successfully. Jan 29 11:13:48.014047 systemd[1]: session-36.scope: Deactivated successfully. Jan 29 11:13:48.015060 systemd-logind[1458]: Session 36 logged out. Waiting for processes to exit. Jan 29 11:13:48.016608 systemd-logind[1458]: Removed session 36. Jan 29 11:13:48.897228 update_engine[1461]: I20250129 11:13:48.897109 1461 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 29 11:13:48.897228 update_engine[1461]: I20250129 11:13:48.897192 1461 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 29 11:13:48.897928 update_engine[1461]: I20250129 11:13:48.897660 1461 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 29 11:13:48.899836 update_engine[1461]: I20250129 11:13:48.899655 1461 omaha_request_params.cc:62] Current group set to stable Jan 29 11:13:48.901218 update_engine[1461]: I20250129 11:13:48.901186 1461 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 29 11:13:48.901847 update_engine[1461]: I20250129 11:13:48.901306 1461 update_attempter.cc:643] Scheduling an action processor start. Jan 29 11:13:48.901847 update_engine[1461]: I20250129 11:13:48.901332 1461 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 29 11:13:48.901847 update_engine[1461]: I20250129 11:13:48.901372 1461 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 29 11:13:48.901847 update_engine[1461]: I20250129 11:13:48.901430 1461 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 29 11:13:48.901847 update_engine[1461]: I20250129 11:13:48.901438 1461 omaha_request_action.cc:272] Request: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: Jan 29 11:13:48.901847 update_engine[1461]: I20250129 11:13:48.901444 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 11:13:48.906838 update_engine[1461]: I20250129 11:13:48.904907 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 11:13:48.907469 update_engine[1461]: I20250129 11:13:48.907346 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 11:13:48.907637 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 29 11:13:48.909739 update_engine[1461]: E20250129 11:13:48.909654 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 11:13:48.915209 update_engine[1461]: I20250129 11:13:48.915146 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 29 11:13:53.182153 systemd[1]: Started sshd@39-78.46.186.225:22-147.75.109.163:46066.service - OpenSSH per-connection server daemon (147.75.109.163:46066). 
Jan 29 11:13:54.169684 sshd[7118]: Accepted publickey for core from 147.75.109.163 port 46066 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:13:54.172568 sshd-session[7118]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:13:54.179390 systemd-logind[1458]: New session 37 of user core. Jan 29 11:13:54.185125 systemd[1]: Started session-37.scope - Session 37 of User core. Jan 29 11:13:54.921059 sshd[7120]: Connection closed by 147.75.109.163 port 46066 Jan 29 11:13:54.922109 sshd-session[7118]: pam_unix(sshd:session): session closed for user core Jan 29 11:13:54.926843 systemd-logind[1458]: Session 37 logged out. Waiting for processes to exit. Jan 29 11:13:54.927850 systemd[1]: sshd@39-78.46.186.225:22-147.75.109.163:46066.service: Deactivated successfully. Jan 29 11:13:54.930637 systemd[1]: session-37.scope: Deactivated successfully. Jan 29 11:13:54.932884 systemd-logind[1458]: Removed session 37. Jan 29 11:13:58.896905 update_engine[1461]: I20250129 11:13:58.896742 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 11:13:58.897588 update_engine[1461]: I20250129 11:13:58.897177 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 11:13:58.897588 update_engine[1461]: I20250129 11:13:58.897553 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 11:13:58.898102 update_engine[1461]: E20250129 11:13:58.898026 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 11:13:58.898202 update_engine[1461]: I20250129 11:13:58.898130 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 29 11:14:00.103136 systemd[1]: Started sshd@40-78.46.186.225:22-147.75.109.163:51088.service - OpenSSH per-connection server daemon (147.75.109.163:51088). Jan 29 11:14:01.096251 sshd[7134]: Accepted publickey for core from 147.75.109.163 port 51088 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:01.098131 sshd-session[7134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:01.102877 systemd-logind[1458]: New session 38 of user core. Jan 29 11:14:01.107998 systemd[1]: Started session-38.scope - Session 38 of User core. Jan 29 11:14:01.864011 sshd[7136]: Connection closed by 147.75.109.163 port 51088 Jan 29 11:14:01.865002 sshd-session[7134]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:01.871607 systemd-logind[1458]: Session 38 logged out. Waiting for processes to exit. Jan 29 11:14:01.871653 systemd[1]: sshd@40-78.46.186.225:22-147.75.109.163:51088.service: Deactivated successfully. Jan 29 11:14:01.873576 systemd[1]: session-38.scope: Deactivated successfully. Jan 29 11:14:01.879800 systemd-logind[1458]: Removed session 38. Jan 29 11:14:07.042265 systemd[1]: Started sshd@41-78.46.186.225:22-147.75.109.163:51094.service - OpenSSH per-connection server daemon (147.75.109.163:51094). Jan 29 11:14:08.029722 sshd[7167]: Accepted publickey for core from 147.75.109.163 port 51094 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:08.032089 sshd-session[7167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:08.036826 systemd-logind[1458]: New session 39 of user core. Jan 29 11:14:08.043101 systemd[1]: Started session-39.scope - Session 39 of User core. 
Jan 29 11:14:08.786886 sshd[7169]: Connection closed by 147.75.109.163 port 51094 Jan 29 11:14:08.787684 sshd-session[7167]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:08.792746 systemd[1]: sshd@41-78.46.186.225:22-147.75.109.163:51094.service: Deactivated successfully. Jan 29 11:14:08.795444 systemd[1]: session-39.scope: Deactivated successfully. Jan 29 11:14:08.796492 systemd-logind[1458]: Session 39 logged out. Waiting for processes to exit. Jan 29 11:14:08.797395 systemd-logind[1458]: Removed session 39. Jan 29 11:14:08.896629 update_engine[1461]: I20250129 11:14:08.896513 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 11:14:08.897259 update_engine[1461]: I20250129 11:14:08.896948 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 11:14:08.897335 update_engine[1461]: I20250129 11:14:08.897287 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 11:14:08.897923 update_engine[1461]: E20250129 11:14:08.897859 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 11:14:08.898057 update_engine[1461]: I20250129 11:14:08.897953 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 29 11:14:13.962052 systemd[1]: Started sshd@42-78.46.186.225:22-147.75.109.163:40142.service - OpenSSH per-connection server daemon (147.75.109.163:40142). Jan 29 11:14:14.952311 sshd[7205]: Accepted publickey for core from 147.75.109.163 port 40142 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:14.954358 sshd-session[7205]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:14.960129 systemd-logind[1458]: New session 40 of user core. Jan 29 11:14:14.968133 systemd[1]: Started session-40.scope - Session 40 of User core. Jan 29 11:14:15.732045 sshd[7207]: Connection closed by 147.75.109.163 port 40142 Jan 29 11:14:15.733587 sshd-session[7205]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:15.739154 systemd[1]: sshd@42-78.46.186.225:22-147.75.109.163:40142.service: Deactivated successfully. Jan 29 11:14:15.739688 systemd-logind[1458]: Session 40 logged out. Waiting for processes to exit. Jan 29 11:14:15.742662 systemd[1]: session-40.scope: Deactivated successfully. Jan 29 11:14:15.744133 systemd-logind[1458]: Removed session 40. Jan 29 11:14:18.898676 update_engine[1461]: I20250129 11:14:18.898561 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 11:14:18.899410 update_engine[1461]: I20250129 11:14:18.899070 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 11:14:18.899467 update_engine[1461]: I20250129 11:14:18.899402 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 11:14:18.900007 update_engine[1461]: E20250129 11:14:18.899940 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 11:14:18.900129 update_engine[1461]: I20250129 11:14:18.900022 1461 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 29 11:14:18.900129 update_engine[1461]: I20250129 11:14:18.900033 1461 omaha_request_action.cc:617] Omaha request response: Jan 29 11:14:18.900224 update_engine[1461]: E20250129 11:14:18.900135 1461 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 29 11:14:18.900224 update_engine[1461]: I20250129 11:14:18.900156 1461 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. 
Aborting processing. Jan 29 11:14:18.900224 update_engine[1461]: I20250129 11:14:18.900163 1461 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 29 11:14:18.900224 update_engine[1461]: I20250129 11:14:18.900169 1461 update_attempter.cc:306] Processing Done. Jan 29 11:14:18.900224 update_engine[1461]: E20250129 11:14:18.900187 1461 update_attempter.cc:619] Update failed. Jan 29 11:14:18.900224 update_engine[1461]: I20250129 11:14:18.900195 1461 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 29 11:14:18.900224 update_engine[1461]: I20250129 11:14:18.900202 1461 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 29 11:14:18.900224 update_engine[1461]: I20250129 11:14:18.900207 1461 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Jan 29 11:14:18.900741 update_engine[1461]: I20250129 11:14:18.900281 1461 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 29 11:14:18.900741 update_engine[1461]: I20250129 11:14:18.900305 1461 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 29 11:14:18.900741 update_engine[1461]: I20250129 11:14:18.900311 1461 omaha_request_action.cc:272] Request: Jan 29 11:14:18.900741 update_engine[1461]: Jan 29 11:14:18.900741 update_engine[1461]: Jan 29 11:14:18.900741 update_engine[1461]: Jan 29 11:14:18.900741 update_engine[1461]: Jan 29 11:14:18.900741 update_engine[1461]: Jan 29 11:14:18.900741 update_engine[1461]: Jan 29 11:14:18.900741 update_engine[1461]: I20250129 11:14:18.900318 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 11:14:18.900741 update_engine[1461]: I20250129 11:14:18.900475 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 11:14:18.901331 update_engine[1461]: I20250129 11:14:18.900882 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 11:14:18.901331 update_engine[1461]: E20250129 11:14:18.901270 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 11:14:18.901331 update_engine[1461]: I20250129 11:14:18.901319 1461 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 29 11:14:18.901331 update_engine[1461]: I20250129 11:14:18.901329 1461 omaha_request_action.cc:617] Omaha request response: Jan 29 11:14:18.901489 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 29 11:14:18.901898 update_engine[1461]: I20250129 11:14:18.901336 1461 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 29 11:14:18.901898 update_engine[1461]: I20250129 11:14:18.901343 1461 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 29 11:14:18.901898 update_engine[1461]: I20250129 11:14:18.901348 1461 update_attempter.cc:306] Processing Done. Jan 29 11:14:18.901898 update_engine[1461]: I20250129 11:14:18.901355 1461 update_attempter.cc:310] Error event sent. 
Jan 29 11:14:18.901898 update_engine[1461]: I20250129 11:14:18.901364 1461 update_check_scheduler.cc:74] Next update check in 42m8s Jan 29 11:14:18.902237 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 29 11:14:20.910367 systemd[1]: Started sshd@43-78.46.186.225:22-147.75.109.163:60774.service - OpenSSH per-connection server daemon (147.75.109.163:60774). Jan 29 11:14:21.912096 sshd[7219]: Accepted publickey for core from 147.75.109.163 port 60774 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:21.914336 sshd-session[7219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:21.920105 systemd-logind[1458]: New session 41 of user core. Jan 29 11:14:21.926115 systemd[1]: Started session-41.scope - Session 41 of User core. Jan 29 11:14:22.674867 sshd[7221]: Connection closed by 147.75.109.163 port 60774 Jan 29 11:14:22.675407 sshd-session[7219]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:22.680927 systemd[1]: sshd@43-78.46.186.225:22-147.75.109.163:60774.service: Deactivated successfully. Jan 29 11:14:22.681698 systemd-logind[1458]: Session 41 logged out. Waiting for processes to exit. Jan 29 11:14:22.683244 systemd[1]: session-41.scope: Deactivated successfully. Jan 29 11:14:22.685654 systemd-logind[1458]: Removed session 41. Jan 29 11:14:27.853989 systemd[1]: Started sshd@44-78.46.186.225:22-147.75.109.163:45544.service - OpenSSH per-connection server daemon (147.75.109.163:45544). Jan 29 11:14:28.827067 sshd[7255]: Accepted publickey for core from 147.75.109.163 port 45544 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:28.829648 sshd-session[7255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:28.837649 systemd-logind[1458]: New session 42 of user core. Jan 29 11:14:28.847145 systemd[1]: Started session-42.scope - Session 42 of User core. Jan 29 11:14:29.572066 sshd[7269]: Connection closed by 147.75.109.163 port 45544 Jan 29 11:14:29.573074 sshd-session[7255]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:29.578508 systemd-logind[1458]: Session 42 logged out. Waiting for processes to exit. Jan 29 11:14:29.578650 systemd[1]: sshd@44-78.46.186.225:22-147.75.109.163:45544.service: Deactivated successfully. Jan 29 11:14:29.582536 systemd[1]: session-42.scope: Deactivated successfully. Jan 29 11:14:29.584748 systemd-logind[1458]: Removed session 42. Jan 29 11:14:34.751229 systemd[1]: Started sshd@45-78.46.186.225:22-147.75.109.163:45548.service - OpenSSH per-connection server daemon (147.75.109.163:45548). Jan 29 11:14:35.743476 sshd[7281]: Accepted publickey for core from 147.75.109.163 port 45548 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:35.745501 sshd-session[7281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:35.751900 systemd-logind[1458]: New session 43 of user core. Jan 29 11:14:35.762119 systemd[1]: Started session-43.scope - Session 43 of User core. Jan 29 11:14:36.503280 sshd[7283]: Connection closed by 147.75.109.163 port 45548 Jan 29 11:14:36.504343 sshd-session[7281]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:36.508926 systemd[1]: sshd@45-78.46.186.225:22-147.75.109.163:45548.service: Deactivated successfully. Jan 29 11:14:36.512335 systemd[1]: session-43.scope: Deactivated successfully. 
Jan 29 11:14:36.515516 systemd-logind[1458]: Session 43 logged out. Waiting for processes to exit. Jan 29 11:14:36.517197 systemd-logind[1458]: Removed session 43. Jan 29 11:14:36.681317 systemd[1]: Started sshd@46-78.46.186.225:22-147.75.109.163:45556.service - OpenSSH per-connection server daemon (147.75.109.163:45556). Jan 29 11:14:37.672079 sshd[7294]: Accepted publickey for core from 147.75.109.163 port 45556 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:37.673409 sshd-session[7294]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:37.678682 systemd-logind[1458]: New session 44 of user core. Jan 29 11:14:37.682001 systemd[1]: Started session-44.scope - Session 44 of User core. Jan 29 11:14:38.466478 sshd[7296]: Connection closed by 147.75.109.163 port 45556 Jan 29 11:14:38.467883 sshd-session[7294]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:38.472728 systemd[1]: sshd@46-78.46.186.225:22-147.75.109.163:45556.service: Deactivated successfully. Jan 29 11:14:38.474717 systemd[1]: session-44.scope: Deactivated successfully. Jan 29 11:14:38.475529 systemd-logind[1458]: Session 44 logged out. Waiting for processes to exit. Jan 29 11:14:38.476706 systemd-logind[1458]: Removed session 44. Jan 29 11:14:38.642234 systemd[1]: Started sshd@47-78.46.186.225:22-147.75.109.163:48760.service - OpenSSH per-connection server daemon (147.75.109.163:48760). Jan 29 11:14:39.620713 sshd[7305]: Accepted publickey for core from 147.75.109.163 port 48760 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:39.623095 sshd-session[7305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:39.628182 systemd-logind[1458]: New session 45 of user core. Jan 29 11:14:39.633343 systemd[1]: Started session-45.scope - Session 45 of User core. Jan 29 11:14:40.369855 sshd[7307]: Connection closed by 147.75.109.163 port 48760 Jan 29 11:14:40.370681 sshd-session[7305]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:40.374768 systemd[1]: sshd@47-78.46.186.225:22-147.75.109.163:48760.service: Deactivated successfully. Jan 29 11:14:40.378087 systemd[1]: session-45.scope: Deactivated successfully. Jan 29 11:14:40.380344 systemd-logind[1458]: Session 45 logged out. Waiting for processes to exit. Jan 29 11:14:40.383176 systemd-logind[1458]: Removed session 45. Jan 29 11:14:45.544103 systemd[1]: Started sshd@48-78.46.186.225:22-147.75.109.163:48762.service - OpenSSH per-connection server daemon (147.75.109.163:48762). Jan 29 11:14:46.540379 sshd[7339]: Accepted publickey for core from 147.75.109.163 port 48762 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:46.542495 sshd-session[7339]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:46.547684 systemd-logind[1458]: New session 46 of user core. Jan 29 11:14:46.553995 systemd[1]: Started session-46.scope - Session 46 of User core. Jan 29 11:14:47.291072 sshd[7349]: Connection closed by 147.75.109.163 port 48762 Jan 29 11:14:47.292236 sshd-session[7339]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:47.298084 systemd[1]: sshd@48-78.46.186.225:22-147.75.109.163:48762.service: Deactivated successfully. Jan 29 11:14:47.301920 systemd[1]: session-46.scope: Deactivated successfully. Jan 29 11:14:47.304595 systemd-logind[1458]: Session 46 logged out. Waiting for processes to exit. 
Jan 29 11:14:47.306672 systemd-logind[1458]: Removed session 46. Jan 29 11:14:52.466569 systemd[1]: Started sshd@49-78.46.186.225:22-147.75.109.163:32976.service - OpenSSH per-connection server daemon (147.75.109.163:32976). Jan 29 11:14:53.448425 sshd[7382]: Accepted publickey for core from 147.75.109.163 port 32976 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:14:53.450395 sshd-session[7382]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:14:53.455945 systemd-logind[1458]: New session 47 of user core. Jan 29 11:14:53.461064 systemd[1]: Started session-47.scope - Session 47 of User core. Jan 29 11:14:54.204002 sshd[7384]: Connection closed by 147.75.109.163 port 32976 Jan 29 11:14:54.205060 sshd-session[7382]: pam_unix(sshd:session): session closed for user core Jan 29 11:14:54.209602 systemd[1]: sshd@49-78.46.186.225:22-147.75.109.163:32976.service: Deactivated successfully. Jan 29 11:14:54.211597 systemd[1]: session-47.scope: Deactivated successfully. Jan 29 11:14:54.213508 systemd-logind[1458]: Session 47 logged out. Waiting for processes to exit. Jan 29 11:14:54.214438 systemd-logind[1458]: Removed session 47. Jan 29 11:14:59.377183 systemd[1]: Started sshd@50-78.46.186.225:22-147.75.109.163:53916.service - OpenSSH per-connection server daemon (147.75.109.163:53916). Jan 29 11:15:00.356354 sshd[7397]: Accepted publickey for core from 147.75.109.163 port 53916 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:00.359290 sshd-session[7397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:00.365894 systemd-logind[1458]: New session 48 of user core. Jan 29 11:15:00.371050 systemd[1]: Started session-48.scope - Session 48 of User core. Jan 29 11:15:01.109453 sshd[7399]: Connection closed by 147.75.109.163 port 53916 Jan 29 11:15:01.110416 sshd-session[7397]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:01.117946 systemd[1]: sshd@50-78.46.186.225:22-147.75.109.163:53916.service: Deactivated successfully. Jan 29 11:15:01.121391 systemd[1]: session-48.scope: Deactivated successfully. Jan 29 11:15:01.122961 systemd-logind[1458]: Session 48 logged out. Waiting for processes to exit. Jan 29 11:15:01.124016 systemd-logind[1458]: Removed session 48. Jan 29 11:15:06.288294 systemd[1]: Started sshd@51-78.46.186.225:22-147.75.109.163:53928.service - OpenSSH per-connection server daemon (147.75.109.163:53928). Jan 29 11:15:07.283373 sshd[7429]: Accepted publickey for core from 147.75.109.163 port 53928 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:07.285237 sshd-session[7429]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:07.290930 systemd-logind[1458]: New session 49 of user core. Jan 29 11:15:07.296141 systemd[1]: Started session-49.scope - Session 49 of User core. Jan 29 11:15:08.052974 sshd[7431]: Connection closed by 147.75.109.163 port 53928 Jan 29 11:15:08.053862 sshd-session[7429]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:08.058760 systemd[1]: sshd@51-78.46.186.225:22-147.75.109.163:53928.service: Deactivated successfully. Jan 29 11:15:08.062028 systemd[1]: session-49.scope: Deactivated successfully. Jan 29 11:15:08.063272 systemd-logind[1458]: Session 49 logged out. Waiting for processes to exit. Jan 29 11:15:08.064477 systemd-logind[1458]: Removed session 49. 
Jan 29 11:15:13.230157 systemd[1]: Started sshd@52-78.46.186.225:22-147.75.109.163:43104.service - OpenSSH per-connection server daemon (147.75.109.163:43104). Jan 29 11:15:14.213581 sshd[7461]: Accepted publickey for core from 147.75.109.163 port 43104 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:14.215750 sshd-session[7461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:14.222582 systemd-logind[1458]: New session 50 of user core. Jan 29 11:15:14.228509 systemd[1]: Started session-50.scope - Session 50 of User core. Jan 29 11:15:14.967231 sshd[7463]: Connection closed by 147.75.109.163 port 43104 Jan 29 11:15:14.968722 sshd-session[7461]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:14.973967 systemd-logind[1458]: Session 50 logged out. Waiting for processes to exit. Jan 29 11:15:14.974949 systemd[1]: sshd@52-78.46.186.225:22-147.75.109.163:43104.service: Deactivated successfully. Jan 29 11:15:14.977621 systemd[1]: session-50.scope: Deactivated successfully. Jan 29 11:15:14.980352 systemd-logind[1458]: Removed session 50. Jan 29 11:15:20.149220 systemd[1]: Started sshd@53-78.46.186.225:22-147.75.109.163:53710.service - OpenSSH per-connection server daemon (147.75.109.163:53710). Jan 29 11:15:21.141907 sshd[7474]: Accepted publickey for core from 147.75.109.163 port 53710 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:21.143789 sshd-session[7474]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:21.148131 systemd-logind[1458]: New session 51 of user core. Jan 29 11:15:21.154558 systemd[1]: Started session-51.scope - Session 51 of User core. Jan 29 11:15:21.900218 sshd[7476]: Connection closed by 147.75.109.163 port 53710 Jan 29 11:15:21.904022 sshd-session[7474]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:21.908020 systemd-logind[1458]: Session 51 logged out. Waiting for processes to exit. Jan 29 11:15:21.908109 systemd[1]: sshd@53-78.46.186.225:22-147.75.109.163:53710.service: Deactivated successfully. Jan 29 11:15:21.912339 systemd[1]: session-51.scope: Deactivated successfully. Jan 29 11:15:21.914496 systemd-logind[1458]: Removed session 51. Jan 29 11:15:27.079277 systemd[1]: Started sshd@54-78.46.186.225:22-147.75.109.163:53716.service - OpenSSH per-connection server daemon (147.75.109.163:53716). Jan 29 11:15:28.073871 sshd[7510]: Accepted publickey for core from 147.75.109.163 port 53716 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:28.076097 sshd-session[7510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:28.081446 systemd-logind[1458]: New session 52 of user core. Jan 29 11:15:28.096144 systemd[1]: Started session-52.scope - Session 52 of User core. Jan 29 11:15:28.841661 sshd[7512]: Connection closed by 147.75.109.163 port 53716 Jan 29 11:15:28.842152 sshd-session[7510]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:28.846698 systemd-logind[1458]: Session 52 logged out. Waiting for processes to exit. Jan 29 11:15:28.847188 systemd[1]: sshd@54-78.46.186.225:22-147.75.109.163:53716.service: Deactivated successfully. Jan 29 11:15:28.848855 systemd[1]: session-52.scope: Deactivated successfully. Jan 29 11:15:28.850643 systemd-logind[1458]: Removed session 52. 
Jan 29 11:15:34.028124 systemd[1]: Started sshd@55-78.46.186.225:22-147.75.109.163:55382.service - OpenSSH per-connection server daemon (147.75.109.163:55382). Jan 29 11:15:35.024626 sshd[7525]: Accepted publickey for core from 147.75.109.163 port 55382 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:35.026996 sshd-session[7525]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:35.034292 systemd-logind[1458]: New session 53 of user core. Jan 29 11:15:35.039110 systemd[1]: Started session-53.scope - Session 53 of User core. Jan 29 11:15:35.790764 sshd[7527]: Connection closed by 147.75.109.163 port 55382 Jan 29 11:15:35.791648 sshd-session[7525]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:35.797179 systemd[1]: sshd@55-78.46.186.225:22-147.75.109.163:55382.service: Deactivated successfully. Jan 29 11:15:35.800056 systemd[1]: session-53.scope: Deactivated successfully. Jan 29 11:15:35.801526 systemd-logind[1458]: Session 53 logged out. Waiting for processes to exit. Jan 29 11:15:35.803538 systemd-logind[1458]: Removed session 53. Jan 29 11:15:40.964181 systemd[1]: Started sshd@56-78.46.186.225:22-147.75.109.163:57244.service - OpenSSH per-connection server daemon (147.75.109.163:57244). Jan 29 11:15:41.950590 sshd[7558]: Accepted publickey for core from 147.75.109.163 port 57244 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:41.953407 sshd-session[7558]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:41.958978 systemd-logind[1458]: New session 54 of user core. Jan 29 11:15:41.968130 systemd[1]: Started session-54.scope - Session 54 of User core. Jan 29 11:15:42.712229 sshd[7560]: Connection closed by 147.75.109.163 port 57244 Jan 29 11:15:42.712747 sshd-session[7558]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:42.720954 systemd[1]: sshd@56-78.46.186.225:22-147.75.109.163:57244.service: Deactivated successfully. Jan 29 11:15:42.724288 systemd[1]: session-54.scope: Deactivated successfully. Jan 29 11:15:42.725727 systemd-logind[1458]: Session 54 logged out. Waiting for processes to exit. Jan 29 11:15:42.727427 systemd-logind[1458]: Removed session 54. Jan 29 11:15:47.892278 systemd[1]: Started sshd@57-78.46.186.225:22-147.75.109.163:53514.service - OpenSSH per-connection server daemon (147.75.109.163:53514). Jan 29 11:15:48.886807 sshd[7570]: Accepted publickey for core from 147.75.109.163 port 53514 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:48.889044 sshd-session[7570]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:48.894587 systemd-logind[1458]: New session 55 of user core. Jan 29 11:15:48.903124 systemd[1]: Started session-55.scope - Session 55 of User core. Jan 29 11:15:49.650126 sshd[7572]: Connection closed by 147.75.109.163 port 53514 Jan 29 11:15:49.651108 sshd-session[7570]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:49.656596 systemd[1]: sshd@57-78.46.186.225:22-147.75.109.163:53514.service: Deactivated successfully. Jan 29 11:15:49.659927 systemd[1]: session-55.scope: Deactivated successfully. Jan 29 11:15:49.661048 systemd-logind[1458]: Session 55 logged out. Waiting for processes to exit. Jan 29 11:15:49.663354 systemd-logind[1458]: Removed session 55. 
Jan 29 11:15:54.827469 systemd[1]: Started sshd@58-78.46.186.225:22-147.75.109.163:53526.service - OpenSSH per-connection server daemon (147.75.109.163:53526). Jan 29 11:15:55.813178 sshd[7610]: Accepted publickey for core from 147.75.109.163 port 53526 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:15:55.814748 sshd-session[7610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:15:55.819969 systemd-logind[1458]: New session 56 of user core. Jan 29 11:15:55.826145 systemd[1]: Started session-56.scope - Session 56 of User core. Jan 29 11:15:56.575558 sshd[7614]: Connection closed by 147.75.109.163 port 53526 Jan 29 11:15:56.576150 sshd-session[7610]: pam_unix(sshd:session): session closed for user core Jan 29 11:15:56.581985 systemd[1]: sshd@58-78.46.186.225:22-147.75.109.163:53526.service: Deactivated successfully. Jan 29 11:15:56.584523 systemd[1]: session-56.scope: Deactivated successfully. Jan 29 11:15:56.586459 systemd-logind[1458]: Session 56 logged out. Waiting for processes to exit. Jan 29 11:15:56.587676 systemd-logind[1458]: Removed session 56. Jan 29 11:16:01.752183 systemd[1]: Started sshd@59-78.46.186.225:22-147.75.109.163:54014.service - OpenSSH per-connection server daemon (147.75.109.163:54014). Jan 29 11:16:02.736155 sshd[7637]: Accepted publickey for core from 147.75.109.163 port 54014 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:02.736716 sshd-session[7637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:02.741371 systemd-logind[1458]: New session 57 of user core. Jan 29 11:16:02.748163 systemd[1]: Started session-57.scope - Session 57 of User core. Jan 29 11:16:03.514106 sshd[7657]: Connection closed by 147.75.109.163 port 54014 Jan 29 11:16:03.514918 sshd-session[7637]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:03.519928 systemd[1]: sshd@59-78.46.186.225:22-147.75.109.163:54014.service: Deactivated successfully. Jan 29 11:16:03.522802 systemd[1]: session-57.scope: Deactivated successfully. Jan 29 11:16:03.526362 systemd-logind[1458]: Session 57 logged out. Waiting for processes to exit. Jan 29 11:16:03.527408 systemd-logind[1458]: Removed session 57. Jan 29 11:16:08.692258 systemd[1]: Started sshd@60-78.46.186.225:22-147.75.109.163:57046.service - OpenSSH per-connection server daemon (147.75.109.163:57046). Jan 29 11:16:09.679719 sshd[7668]: Accepted publickey for core from 147.75.109.163 port 57046 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:09.681859 sshd-session[7668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:09.687844 systemd-logind[1458]: New session 58 of user core. Jan 29 11:16:09.696004 systemd[1]: Started session-58.scope - Session 58 of User core. Jan 29 11:16:10.455182 sshd[7670]: Connection closed by 147.75.109.163 port 57046 Jan 29 11:16:10.456056 sshd-session[7668]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:10.461996 systemd[1]: session-58.scope: Deactivated successfully. Jan 29 11:16:10.463793 systemd[1]: sshd@60-78.46.186.225:22-147.75.109.163:57046.service: Deactivated successfully. Jan 29 11:16:10.467787 systemd-logind[1458]: Session 58 logged out. Waiting for processes to exit. Jan 29 11:16:10.469183 systemd-logind[1458]: Removed session 58. 
Jan 29 11:16:15.628808 systemd[1]: Started sshd@61-78.46.186.225:22-147.75.109.163:57062.service - OpenSSH per-connection server daemon (147.75.109.163:57062). Jan 29 11:16:16.639987 sshd[7699]: Accepted publickey for core from 147.75.109.163 port 57062 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:16.642255 sshd-session[7699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:16.647772 systemd-logind[1458]: New session 59 of user core. Jan 29 11:16:16.653255 systemd[1]: Started session-59.scope - Session 59 of User core. Jan 29 11:16:17.393147 sshd[7701]: Connection closed by 147.75.109.163 port 57062 Jan 29 11:16:17.393910 sshd-session[7699]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:17.397430 systemd[1]: sshd@61-78.46.186.225:22-147.75.109.163:57062.service: Deactivated successfully. Jan 29 11:16:17.400766 systemd[1]: session-59.scope: Deactivated successfully. Jan 29 11:16:17.403518 systemd-logind[1458]: Session 59 logged out. Waiting for processes to exit. Jan 29 11:16:17.405191 systemd-logind[1458]: Removed session 59. Jan 29 11:16:22.567095 systemd[1]: Started sshd@62-78.46.186.225:22-147.75.109.163:57696.service - OpenSSH per-connection server daemon (147.75.109.163:57696). Jan 29 11:16:23.540795 sshd[7732]: Accepted publickey for core from 147.75.109.163 port 57696 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:23.541771 sshd-session[7732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:23.545980 systemd-logind[1458]: New session 60 of user core. Jan 29 11:16:23.554018 systemd[1]: Started session-60.scope - Session 60 of User core. Jan 29 11:16:24.291975 sshd[7734]: Connection closed by 147.75.109.163 port 57696 Jan 29 11:16:24.291856 sshd-session[7732]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:24.296621 systemd[1]: sshd@62-78.46.186.225:22-147.75.109.163:57696.service: Deactivated successfully. Jan 29 11:16:24.299169 systemd[1]: session-60.scope: Deactivated successfully. Jan 29 11:16:24.302492 systemd-logind[1458]: Session 60 logged out. Waiting for processes to exit. Jan 29 11:16:24.304620 systemd-logind[1458]: Removed session 60. Jan 29 11:16:29.472133 systemd[1]: Started sshd@63-78.46.186.225:22-147.75.109.163:54754.service - OpenSSH per-connection server daemon (147.75.109.163:54754). Jan 29 11:16:30.467305 sshd[7746]: Accepted publickey for core from 147.75.109.163 port 54754 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:30.469471 sshd-session[7746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:30.475807 systemd-logind[1458]: New session 61 of user core. Jan 29 11:16:30.484175 systemd[1]: Started session-61.scope - Session 61 of User core. Jan 29 11:16:31.223594 sshd[7748]: Connection closed by 147.75.109.163 port 54754 Jan 29 11:16:31.224140 sshd-session[7746]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:31.229152 systemd-logind[1458]: Session 61 logged out. Waiting for processes to exit. Jan 29 11:16:31.230000 systemd[1]: sshd@63-78.46.186.225:22-147.75.109.163:54754.service: Deactivated successfully. Jan 29 11:16:31.233426 systemd[1]: session-61.scope: Deactivated successfully. Jan 29 11:16:31.235080 systemd-logind[1458]: Removed session 61. 
Jan 29 11:16:36.402318 systemd[1]: Started sshd@64-78.46.186.225:22-147.75.109.163:54762.service - OpenSSH per-connection server daemon (147.75.109.163:54762). Jan 29 11:16:37.393867 sshd[7760]: Accepted publickey for core from 147.75.109.163 port 54762 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:37.395281 sshd-session[7760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:37.401135 systemd-logind[1458]: New session 62 of user core. Jan 29 11:16:37.409993 systemd[1]: Started session-62.scope - Session 62 of User core. Jan 29 11:16:38.150997 sshd[7762]: Connection closed by 147.75.109.163 port 54762 Jan 29 11:16:38.150866 sshd-session[7760]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:38.155973 systemd-logind[1458]: Session 62 logged out. Waiting for processes to exit. Jan 29 11:16:38.157193 systemd[1]: sshd@64-78.46.186.225:22-147.75.109.163:54762.service: Deactivated successfully. Jan 29 11:16:38.160170 systemd[1]: session-62.scope: Deactivated successfully. Jan 29 11:16:38.161435 systemd-logind[1458]: Removed session 62. Jan 29 11:16:43.330181 systemd[1]: Started sshd@65-78.46.186.225:22-147.75.109.163:39812.service - OpenSSH per-connection server daemon (147.75.109.163:39812). Jan 29 11:16:44.325133 sshd[7794]: Accepted publickey for core from 147.75.109.163 port 39812 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:44.327344 sshd-session[7794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:44.332113 systemd-logind[1458]: New session 63 of user core. Jan 29 11:16:44.338177 systemd[1]: Started session-63.scope - Session 63 of User core. Jan 29 11:16:45.093972 sshd[7797]: Connection closed by 147.75.109.163 port 39812 Jan 29 11:16:45.094721 sshd-session[7794]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:45.101517 systemd[1]: sshd@65-78.46.186.225:22-147.75.109.163:39812.service: Deactivated successfully. Jan 29 11:16:45.104584 systemd[1]: session-63.scope: Deactivated successfully. Jan 29 11:16:45.106410 systemd-logind[1458]: Session 63 logged out. Waiting for processes to exit. Jan 29 11:16:45.107603 systemd-logind[1458]: Removed session 63. Jan 29 11:16:50.268115 systemd[1]: Started sshd@66-78.46.186.225:22-147.75.109.163:55416.service - OpenSSH per-connection server daemon (147.75.109.163:55416). Jan 29 11:16:51.242371 sshd[7809]: Accepted publickey for core from 147.75.109.163 port 55416 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:51.244096 sshd-session[7809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:51.248905 systemd-logind[1458]: New session 64 of user core. Jan 29 11:16:51.258096 systemd[1]: Started session-64.scope - Session 64 of User core. Jan 29 11:16:51.997328 sshd[7811]: Connection closed by 147.75.109.163 port 55416 Jan 29 11:16:51.997993 sshd-session[7809]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:52.003449 systemd[1]: sshd@66-78.46.186.225:22-147.75.109.163:55416.service: Deactivated successfully. Jan 29 11:16:52.005541 systemd[1]: session-64.scope: Deactivated successfully. Jan 29 11:16:52.006454 systemd-logind[1458]: Session 64 logged out. Waiting for processes to exit. Jan 29 11:16:52.008174 systemd-logind[1458]: Removed session 64. 
Jan 29 11:16:57.176152 systemd[1]: Started sshd@67-78.46.186.225:22-147.75.109.163:55424.service - OpenSSH per-connection server daemon (147.75.109.163:55424). Jan 29 11:16:58.155781 sshd[7847]: Accepted publickey for core from 147.75.109.163 port 55424 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:16:58.158152 sshd-session[7847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:16:58.163414 systemd-logind[1458]: New session 65 of user core. Jan 29 11:16:58.170562 systemd[1]: Started session-65.scope - Session 65 of User core. Jan 29 11:16:58.918526 sshd[7849]: Connection closed by 147.75.109.163 port 55424 Jan 29 11:16:58.919295 sshd-session[7847]: pam_unix(sshd:session): session closed for user core Jan 29 11:16:58.924155 systemd[1]: sshd@67-78.46.186.225:22-147.75.109.163:55424.service: Deactivated successfully. Jan 29 11:16:58.926621 systemd[1]: session-65.scope: Deactivated successfully. Jan 29 11:16:58.928507 systemd-logind[1458]: Session 65 logged out. Waiting for processes to exit. Jan 29 11:16:58.930270 systemd-logind[1458]: Removed session 65. Jan 29 11:17:04.101331 systemd[1]: Started sshd@68-78.46.186.225:22-147.75.109.163:43074.service - OpenSSH per-connection server daemon (147.75.109.163:43074). Jan 29 11:17:05.088267 sshd[7878]: Accepted publickey for core from 147.75.109.163 port 43074 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:05.090162 sshd-session[7878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:05.096130 systemd-logind[1458]: New session 66 of user core. Jan 29 11:17:05.102021 systemd[1]: Started session-66.scope - Session 66 of User core. Jan 29 11:17:05.853968 sshd[7880]: Connection closed by 147.75.109.163 port 43074 Jan 29 11:17:05.856077 sshd-session[7878]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:05.861694 systemd[1]: sshd@68-78.46.186.225:22-147.75.109.163:43074.service: Deactivated successfully. Jan 29 11:17:05.864469 systemd[1]: session-66.scope: Deactivated successfully. Jan 29 11:17:05.867258 systemd-logind[1458]: Session 66 logged out. Waiting for processes to exit. Jan 29 11:17:05.869548 systemd-logind[1458]: Removed session 66. Jan 29 11:17:11.034365 systemd[1]: Started sshd@69-78.46.186.225:22-147.75.109.163:37448.service - OpenSSH per-connection server daemon (147.75.109.163:37448). Jan 29 11:17:12.025552 sshd[7917]: Accepted publickey for core from 147.75.109.163 port 37448 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:12.028511 sshd-session[7917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:12.035040 systemd-logind[1458]: New session 67 of user core. Jan 29 11:17:12.038073 systemd[1]: Started session-67.scope - Session 67 of User core. Jan 29 11:17:12.784114 sshd[7920]: Connection closed by 147.75.109.163 port 37448 Jan 29 11:17:12.785150 sshd-session[7917]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:12.791950 systemd[1]: sshd@69-78.46.186.225:22-147.75.109.163:37448.service: Deactivated successfully. Jan 29 11:17:12.795429 systemd[1]: session-67.scope: Deactivated successfully. Jan 29 11:17:12.798505 systemd-logind[1458]: Session 67 logged out. Waiting for processes to exit. Jan 29 11:17:12.801383 systemd-logind[1458]: Removed session 67. 
Jan 29 11:17:17.957114 systemd[1]: Started sshd@70-78.46.186.225:22-147.75.109.163:59114.service - OpenSSH per-connection server daemon (147.75.109.163:59114). Jan 29 11:17:18.935050 sshd[7931]: Accepted publickey for core from 147.75.109.163 port 59114 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:18.936768 sshd-session[7931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:18.941893 systemd-logind[1458]: New session 68 of user core. Jan 29 11:17:18.949009 systemd[1]: Started session-68.scope - Session 68 of User core. Jan 29 11:17:19.687386 sshd[7933]: Connection closed by 147.75.109.163 port 59114 Jan 29 11:17:19.687271 sshd-session[7931]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:19.691889 systemd[1]: sshd@70-78.46.186.225:22-147.75.109.163:59114.service: Deactivated successfully. Jan 29 11:17:19.696888 systemd[1]: session-68.scope: Deactivated successfully. Jan 29 11:17:19.700347 systemd-logind[1458]: Session 68 logged out. Waiting for processes to exit. Jan 29 11:17:19.702863 systemd-logind[1458]: Removed session 68. Jan 29 11:17:22.379347 systemd[1]: run-containerd-runc-k8s.io-9b843505722182c179c1058f3c0470699582b0bf4fcbe36794fd4e0ab6213154-runc.WUb0OV.mount: Deactivated successfully. Jan 29 11:17:24.864398 systemd[1]: Started sshd@71-78.46.186.225:22-147.75.109.163:59116.service - OpenSSH per-connection server daemon (147.75.109.163:59116). Jan 29 11:17:25.854352 sshd[7969]: Accepted publickey for core from 147.75.109.163 port 59116 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:25.856274 sshd-session[7969]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:25.861180 systemd-logind[1458]: New session 69 of user core. Jan 29 11:17:25.868058 systemd[1]: Started session-69.scope - Session 69 of User core. Jan 29 11:17:26.615594 sshd[7973]: Connection closed by 147.75.109.163 port 59116 Jan 29 11:17:26.616544 sshd-session[7969]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:26.622689 systemd[1]: sshd@71-78.46.186.225:22-147.75.109.163:59116.service: Deactivated successfully. Jan 29 11:17:26.625448 systemd[1]: session-69.scope: Deactivated successfully. Jan 29 11:17:26.626711 systemd-logind[1458]: Session 69 logged out. Waiting for processes to exit. Jan 29 11:17:26.628048 systemd-logind[1458]: Removed session 69. Jan 29 11:17:31.791339 systemd[1]: Started sshd@72-78.46.186.225:22-147.75.109.163:37576.service - OpenSSH per-connection server daemon (147.75.109.163:37576). Jan 29 11:17:32.773494 sshd[7983]: Accepted publickey for core from 147.75.109.163 port 37576 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:32.776241 sshd-session[7983]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:32.782884 systemd-logind[1458]: New session 70 of user core. Jan 29 11:17:32.788030 systemd[1]: Started session-70.scope - Session 70 of User core. Jan 29 11:17:33.535521 sshd[7985]: Connection closed by 147.75.109.163 port 37576 Jan 29 11:17:33.534764 sshd-session[7983]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:33.540693 systemd[1]: sshd@72-78.46.186.225:22-147.75.109.163:37576.service: Deactivated successfully. Jan 29 11:17:33.544792 systemd[1]: session-70.scope: Deactivated successfully. Jan 29 11:17:33.547387 systemd-logind[1458]: Session 70 logged out. Waiting for processes to exit. 
Jan 29 11:17:33.549391 systemd-logind[1458]: Removed session 70. Jan 29 11:17:38.702902 systemd[1]: Started sshd@73-78.46.186.225:22-147.75.109.163:45144.service - OpenSSH per-connection server daemon (147.75.109.163:45144). Jan 29 11:17:39.684260 sshd[8008]: Accepted publickey for core from 147.75.109.163 port 45144 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:39.686289 sshd-session[8008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:39.691346 systemd-logind[1458]: New session 71 of user core. Jan 29 11:17:39.697116 systemd[1]: Started session-71.scope - Session 71 of User core. Jan 29 11:17:40.436988 sshd[8010]: Connection closed by 147.75.109.163 port 45144 Jan 29 11:17:40.437944 sshd-session[8008]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:40.443043 systemd[1]: sshd@73-78.46.186.225:22-147.75.109.163:45144.service: Deactivated successfully. Jan 29 11:17:40.446647 systemd[1]: session-71.scope: Deactivated successfully. Jan 29 11:17:40.447785 systemd-logind[1458]: Session 71 logged out. Waiting for processes to exit. Jan 29 11:17:40.449661 systemd-logind[1458]: Removed session 71. Jan 29 11:17:45.623430 systemd[1]: Started sshd@74-78.46.186.225:22-147.75.109.163:45154.service - OpenSSH per-connection server daemon (147.75.109.163:45154). Jan 29 11:17:46.612408 sshd[8042]: Accepted publickey for core from 147.75.109.163 port 45154 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:46.614557 sshd-session[8042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:46.619166 systemd-logind[1458]: New session 72 of user core. Jan 29 11:17:46.624032 systemd[1]: Started session-72.scope - Session 72 of User core. Jan 29 11:17:47.360802 sshd[8044]: Connection closed by 147.75.109.163 port 45154 Jan 29 11:17:47.361934 sshd-session[8042]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:47.366876 systemd[1]: sshd@74-78.46.186.225:22-147.75.109.163:45154.service: Deactivated successfully. Jan 29 11:17:47.369794 systemd[1]: session-72.scope: Deactivated successfully. Jan 29 11:17:47.371746 systemd-logind[1458]: Session 72 logged out. Waiting for processes to exit. Jan 29 11:17:47.373063 systemd-logind[1458]: Removed session 72. Jan 29 11:17:52.545274 systemd[1]: Started sshd@75-78.46.186.225:22-147.75.109.163:55088.service - OpenSSH per-connection server daemon (147.75.109.163:55088). Jan 29 11:17:53.540811 sshd[8078]: Accepted publickey for core from 147.75.109.163 port 55088 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:17:53.542669 sshd-session[8078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:17:53.547944 systemd-logind[1458]: New session 73 of user core. Jan 29 11:17:53.559131 systemd[1]: Started session-73.scope - Session 73 of User core. Jan 29 11:17:54.313902 sshd[8080]: Connection closed by 147.75.109.163 port 55088 Jan 29 11:17:54.314890 sshd-session[8078]: pam_unix(sshd:session): session closed for user core Jan 29 11:17:54.320643 systemd[1]: sshd@75-78.46.186.225:22-147.75.109.163:55088.service: Deactivated successfully. Jan 29 11:17:54.324658 systemd[1]: session-73.scope: Deactivated successfully. Jan 29 11:17:54.326993 systemd-logind[1458]: Session 73 logged out. Waiting for processes to exit. Jan 29 11:17:54.328872 systemd-logind[1458]: Removed session 73. 
Jan 29 11:17:59.488092 systemd[1]: Started sshd@76-78.46.186.225:22-147.75.109.163:39930.service - OpenSSH per-connection server daemon (147.75.109.163:39930). Jan 29 11:18:00.471951 sshd[8093]: Accepted publickey for core from 147.75.109.163 port 39930 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:00.473671 sshd-session[8093]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:00.478251 systemd-logind[1458]: New session 74 of user core. Jan 29 11:18:00.484995 systemd[1]: Started session-74.scope - Session 74 of User core. Jan 29 11:18:01.219623 sshd[8095]: Connection closed by 147.75.109.163 port 39930 Jan 29 11:18:01.220687 sshd-session[8093]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:01.226117 systemd[1]: sshd@76-78.46.186.225:22-147.75.109.163:39930.service: Deactivated successfully. Jan 29 11:18:01.230409 systemd[1]: session-74.scope: Deactivated successfully. Jan 29 11:18:01.231708 systemd-logind[1458]: Session 74 logged out. Waiting for processes to exit. Jan 29 11:18:01.233594 systemd-logind[1458]: Removed session 74. Jan 29 11:18:06.401116 systemd[1]: Started sshd@77-78.46.186.225:22-147.75.109.163:39940.service - OpenSSH per-connection server daemon (147.75.109.163:39940). Jan 29 11:18:06.405206 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories... Jan 29 11:18:06.436885 systemd-tmpfiles[8127]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 29 11:18:06.437364 systemd-tmpfiles[8127]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 29 11:18:06.438120 systemd-tmpfiles[8127]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 29 11:18:06.438404 systemd-tmpfiles[8127]: ACLs are not supported, ignoring. Jan 29 11:18:06.438495 systemd-tmpfiles[8127]: ACLs are not supported, ignoring. Jan 29 11:18:06.441662 systemd-tmpfiles[8127]: Detected autofs mount point /boot during canonicalization of boot. Jan 29 11:18:06.441678 systemd-tmpfiles[8127]: Skipping /boot Jan 29 11:18:06.448970 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. Jan 29 11:18:06.449341 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories. Jan 29 11:18:07.383794 sshd[8126]: Accepted publickey for core from 147.75.109.163 port 39940 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:07.386138 sshd-session[8126]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:07.392434 systemd-logind[1458]: New session 75 of user core. Jan 29 11:18:07.400094 systemd[1]: Started session-75.scope - Session 75 of User core. Jan 29 11:18:08.143623 sshd[8131]: Connection closed by 147.75.109.163 port 39940 Jan 29 11:18:08.144329 sshd-session[8126]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:08.148599 systemd[1]: sshd@77-78.46.186.225:22-147.75.109.163:39940.service: Deactivated successfully. Jan 29 11:18:08.151653 systemd[1]: session-75.scope: Deactivated successfully. Jan 29 11:18:08.153407 systemd-logind[1458]: Session 75 logged out. Waiting for processes to exit. Jan 29 11:18:08.155279 systemd-logind[1458]: Removed session 75. Jan 29 11:18:13.319191 systemd[1]: Started sshd@78-78.46.186.225:22-147.75.109.163:55300.service - OpenSSH per-connection server daemon (147.75.109.163:55300). 
Jan 29 11:18:14.305168 sshd[8161]: Accepted publickey for core from 147.75.109.163 port 55300 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:14.306923 sshd-session[8161]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:14.311802 systemd-logind[1458]: New session 76 of user core. Jan 29 11:18:14.316005 systemd[1]: Started session-76.scope - Session 76 of User core. Jan 29 11:18:15.063555 sshd[8163]: Connection closed by 147.75.109.163 port 55300 Jan 29 11:18:15.063445 sshd-session[8161]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:15.068559 systemd[1]: sshd@78-78.46.186.225:22-147.75.109.163:55300.service: Deactivated successfully. Jan 29 11:18:15.072286 systemd[1]: session-76.scope: Deactivated successfully. Jan 29 11:18:15.075554 systemd-logind[1458]: Session 76 logged out. Waiting for processes to exit. Jan 29 11:18:15.077854 systemd-logind[1458]: Removed session 76. Jan 29 11:18:20.247652 systemd[1]: Started sshd@79-78.46.186.225:22-147.75.109.163:35556.service - OpenSSH per-connection server daemon (147.75.109.163:35556). Jan 29 11:18:21.230151 sshd[8174]: Accepted publickey for core from 147.75.109.163 port 35556 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:21.232158 sshd-session[8174]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:21.237334 systemd-logind[1458]: New session 77 of user core. Jan 29 11:18:21.240008 systemd[1]: Started session-77.scope - Session 77 of User core. Jan 29 11:18:21.983787 sshd[8176]: Connection closed by 147.75.109.163 port 35556 Jan 29 11:18:21.985358 sshd-session[8174]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:21.989854 systemd[1]: sshd@79-78.46.186.225:22-147.75.109.163:35556.service: Deactivated successfully. Jan 29 11:18:21.992911 systemd[1]: session-77.scope: Deactivated successfully. Jan 29 11:18:21.994838 systemd-logind[1458]: Session 77 logged out. Waiting for processes to exit. Jan 29 11:18:21.996088 systemd-logind[1458]: Removed session 77. Jan 29 11:18:27.162949 systemd[1]: Started sshd@80-78.46.186.225:22-147.75.109.163:35558.service - OpenSSH per-connection server daemon (147.75.109.163:35558). Jan 29 11:18:28.154908 sshd[8211]: Accepted publickey for core from 147.75.109.163 port 35558 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:28.159297 sshd-session[8211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:28.164423 systemd-logind[1458]: New session 78 of user core. Jan 29 11:18:28.172097 systemd[1]: Started session-78.scope - Session 78 of User core. Jan 29 11:18:28.916077 sshd[8213]: Connection closed by 147.75.109.163 port 35558 Jan 29 11:18:28.916924 sshd-session[8211]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:28.923304 systemd[1]: sshd@80-78.46.186.225:22-147.75.109.163:35558.service: Deactivated successfully. Jan 29 11:18:28.926735 systemd[1]: session-78.scope: Deactivated successfully. Jan 29 11:18:28.927796 systemd-logind[1458]: Session 78 logged out. Waiting for processes to exit. Jan 29 11:18:28.928877 systemd-logind[1458]: Removed session 78. Jan 29 11:18:34.092175 systemd[1]: Started sshd@81-78.46.186.225:22-147.75.109.163:53536.service - OpenSSH per-connection server daemon (147.75.109.163:53536). 
Jan 29 11:18:35.087410 sshd[8224]: Accepted publickey for core from 147.75.109.163 port 53536 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:35.089381 sshd-session[8224]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:35.095515 systemd-logind[1458]: New session 79 of user core. Jan 29 11:18:35.102024 systemd[1]: Started session-79.scope - Session 79 of User core. Jan 29 11:18:35.848971 sshd[8226]: Connection closed by 147.75.109.163 port 53536 Jan 29 11:18:35.848207 sshd-session[8224]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:35.854092 systemd[1]: sshd@81-78.46.186.225:22-147.75.109.163:53536.service: Deactivated successfully. Jan 29 11:18:35.856749 systemd[1]: session-79.scope: Deactivated successfully. Jan 29 11:18:35.857882 systemd-logind[1458]: Session 79 logged out. Waiting for processes to exit. Jan 29 11:18:35.859750 systemd-logind[1458]: Removed session 79. Jan 29 11:18:41.022195 systemd[1]: Started sshd@82-78.46.186.225:22-147.75.109.163:53896.service - OpenSSH per-connection server daemon (147.75.109.163:53896). Jan 29 11:18:42.015077 sshd[8260]: Accepted publickey for core from 147.75.109.163 port 53896 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:42.017769 sshd-session[8260]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:42.022518 systemd-logind[1458]: New session 80 of user core. Jan 29 11:18:42.030211 systemd[1]: Started session-80.scope - Session 80 of User core. Jan 29 11:18:42.767087 sshd[8262]: Connection closed by 147.75.109.163 port 53896 Jan 29 11:18:42.768195 sshd-session[8260]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:42.774471 systemd-logind[1458]: Session 80 logged out. Waiting for processes to exit. Jan 29 11:18:42.775748 systemd[1]: sshd@82-78.46.186.225:22-147.75.109.163:53896.service: Deactivated successfully. Jan 29 11:18:42.778234 systemd[1]: session-80.scope: Deactivated successfully. Jan 29 11:18:42.779683 systemd-logind[1458]: Removed session 80. Jan 29 11:18:42.943168 systemd[1]: Started sshd@83-78.46.186.225:22-147.75.109.163:53910.service - OpenSSH per-connection server daemon (147.75.109.163:53910). Jan 29 11:18:43.925361 sshd[8273]: Accepted publickey for core from 147.75.109.163 port 53910 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:43.927297 sshd-session[8273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:43.933050 systemd-logind[1458]: New session 81 of user core. Jan 29 11:18:43.940097 systemd[1]: Started session-81.scope - Session 81 of User core. Jan 29 11:18:44.797885 sshd[8275]: Connection closed by 147.75.109.163 port 53910 Jan 29 11:18:44.799245 sshd-session[8273]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:44.804357 systemd[1]: sshd@83-78.46.186.225:22-147.75.109.163:53910.service: Deactivated successfully. Jan 29 11:18:44.807750 systemd[1]: session-81.scope: Deactivated successfully. Jan 29 11:18:44.808596 systemd-logind[1458]: Session 81 logged out. Waiting for processes to exit. Jan 29 11:18:44.809564 systemd-logind[1458]: Removed session 81. Jan 29 11:18:44.972289 systemd[1]: Started sshd@84-78.46.186.225:22-147.75.109.163:53914.service - OpenSSH per-connection server daemon (147.75.109.163:53914). 
Jan 29 11:18:45.964623 sshd[8284]: Accepted publickey for core from 147.75.109.163 port 53914 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:45.966693 sshd-session[8284]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:45.971294 systemd-logind[1458]: New session 82 of user core. Jan 29 11:18:45.980073 systemd[1]: Started session-82.scope - Session 82 of User core. Jan 29 11:18:48.633381 sshd[8286]: Connection closed by 147.75.109.163 port 53914 Jan 29 11:18:48.634971 sshd-session[8284]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:48.640427 systemd[1]: sshd@84-78.46.186.225:22-147.75.109.163:53914.service: Deactivated successfully. Jan 29 11:18:48.642424 systemd[1]: session-82.scope: Deactivated successfully. Jan 29 11:18:48.643200 systemd-logind[1458]: Session 82 logged out. Waiting for processes to exit. Jan 29 11:18:48.644077 systemd-logind[1458]: Removed session 82. Jan 29 11:18:48.813260 systemd[1]: Started sshd@85-78.46.186.225:22-147.75.109.163:38372.service - OpenSSH per-connection server daemon (147.75.109.163:38372). Jan 29 11:18:49.803632 sshd[8305]: Accepted publickey for core from 147.75.109.163 port 38372 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:49.805800 sshd-session[8305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:49.811670 systemd-logind[1458]: New session 83 of user core. Jan 29 11:18:49.817065 systemd[1]: Started session-83.scope - Session 83 of User core. Jan 29 11:18:50.707586 sshd[8307]: Connection closed by 147.75.109.163 port 38372 Jan 29 11:18:50.708385 sshd-session[8305]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:50.713378 systemd[1]: sshd@85-78.46.186.225:22-147.75.109.163:38372.service: Deactivated successfully. Jan 29 11:18:50.715552 systemd[1]: session-83.scope: Deactivated successfully. Jan 29 11:18:50.719072 systemd-logind[1458]: Session 83 logged out. Waiting for processes to exit. Jan 29 11:18:50.720211 systemd-logind[1458]: Removed session 83. Jan 29 11:18:50.884681 systemd[1]: Started sshd@86-78.46.186.225:22-147.75.109.163:38384.service - OpenSSH per-connection server daemon (147.75.109.163:38384). Jan 29 11:18:51.862041 sshd[8315]: Accepted publickey for core from 147.75.109.163 port 38384 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:51.864774 sshd-session[8315]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:51.870159 systemd-logind[1458]: New session 84 of user core. Jan 29 11:18:51.874085 systemd[1]: Started session-84.scope - Session 84 of User core. Jan 29 11:18:52.617415 sshd[8317]: Connection closed by 147.75.109.163 port 38384 Jan 29 11:18:52.618083 sshd-session[8315]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:52.623411 systemd[1]: sshd@86-78.46.186.225:22-147.75.109.163:38384.service: Deactivated successfully. Jan 29 11:18:52.626105 systemd[1]: session-84.scope: Deactivated successfully. Jan 29 11:18:52.628170 systemd-logind[1458]: Session 84 logged out. Waiting for processes to exit. Jan 29 11:18:52.629538 systemd-logind[1458]: Removed session 84. Jan 29 11:18:57.792072 systemd[1]: Started sshd@87-78.46.186.225:22-147.75.109.163:60850.service - OpenSSH per-connection server daemon (147.75.109.163:60850). 
Jan 29 11:18:58.797363 sshd[8351]: Accepted publickey for core from 147.75.109.163 port 60850 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:18:58.799570 sshd-session[8351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:18:58.805804 systemd-logind[1458]: New session 85 of user core. Jan 29 11:18:58.811007 systemd[1]: Started session-85.scope - Session 85 of User core. Jan 29 11:18:59.559722 sshd[8353]: Connection closed by 147.75.109.163 port 60850 Jan 29 11:18:59.560677 sshd-session[8351]: pam_unix(sshd:session): session closed for user core Jan 29 11:18:59.564740 systemd[1]: sshd@87-78.46.186.225:22-147.75.109.163:60850.service: Deactivated successfully. Jan 29 11:18:59.568018 systemd[1]: session-85.scope: Deactivated successfully. Jan 29 11:18:59.569991 systemd-logind[1458]: Session 85 logged out. Waiting for processes to exit. Jan 29 11:18:59.572700 systemd-logind[1458]: Removed session 85. Jan 29 11:19:04.736127 systemd[1]: Started sshd@88-78.46.186.225:22-147.75.109.163:60852.service - OpenSSH per-connection server daemon (147.75.109.163:60852). Jan 29 11:19:05.724650 sshd[8390]: Accepted publickey for core from 147.75.109.163 port 60852 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:05.726844 sshd-session[8390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:05.732346 systemd-logind[1458]: New session 86 of user core. Jan 29 11:19:05.739126 systemd[1]: Started session-86.scope - Session 86 of User core. Jan 29 11:19:06.482970 sshd[8392]: Connection closed by 147.75.109.163 port 60852 Jan 29 11:19:06.483890 sshd-session[8390]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:06.488272 systemd[1]: sshd@88-78.46.186.225:22-147.75.109.163:60852.service: Deactivated successfully. Jan 29 11:19:06.488346 systemd-logind[1458]: Session 86 logged out. Waiting for processes to exit. Jan 29 11:19:06.490957 systemd[1]: session-86.scope: Deactivated successfully. Jan 29 11:19:06.492383 systemd-logind[1458]: Removed session 86. Jan 29 11:19:11.656134 systemd[1]: Started sshd@89-78.46.186.225:22-147.75.109.163:47776.service - OpenSSH per-connection server daemon (147.75.109.163:47776). Jan 29 11:19:12.643868 sshd[8433]: Accepted publickey for core from 147.75.109.163 port 47776 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:12.646117 sshd-session[8433]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:12.651925 systemd-logind[1458]: New session 87 of user core. Jan 29 11:19:12.658968 systemd[1]: Started session-87.scope - Session 87 of User core. Jan 29 11:19:13.392921 sshd[8435]: Connection closed by 147.75.109.163 port 47776 Jan 29 11:19:13.393869 sshd-session[8433]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:13.399566 systemd[1]: sshd@89-78.46.186.225:22-147.75.109.163:47776.service: Deactivated successfully. Jan 29 11:19:13.402806 systemd[1]: session-87.scope: Deactivated successfully. Jan 29 11:19:13.404078 systemd-logind[1458]: Session 87 logged out. Waiting for processes to exit. Jan 29 11:19:13.405235 systemd-logind[1458]: Removed session 87. Jan 29 11:19:18.569290 systemd[1]: Started sshd@90-78.46.186.225:22-147.75.109.163:48308.service - OpenSSH per-connection server daemon (147.75.109.163:48308). 
Jan 29 11:19:19.555629 sshd[8447]: Accepted publickey for core from 147.75.109.163 port 48308 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:19.557752 sshd-session[8447]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:19.563426 systemd-logind[1458]: New session 88 of user core. Jan 29 11:19:19.571208 systemd[1]: Started session-88.scope - Session 88 of User core. Jan 29 11:19:20.307570 sshd[8449]: Connection closed by 147.75.109.163 port 48308 Jan 29 11:19:20.308161 sshd-session[8447]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:20.312594 systemd[1]: sshd@90-78.46.186.225:22-147.75.109.163:48308.service: Deactivated successfully. Jan 29 11:19:20.316030 systemd[1]: session-88.scope: Deactivated successfully. Jan 29 11:19:20.317438 systemd-logind[1458]: Session 88 logged out. Waiting for processes to exit. Jan 29 11:19:20.318771 systemd-logind[1458]: Removed session 88. Jan 29 11:19:25.479220 systemd[1]: Started sshd@91-78.46.186.225:22-147.75.109.163:48310.service - OpenSSH per-connection server daemon (147.75.109.163:48310). Jan 29 11:19:26.463105 sshd[8486]: Accepted publickey for core from 147.75.109.163 port 48310 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:26.464895 sshd-session[8486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:26.469681 systemd-logind[1458]: New session 89 of user core. Jan 29 11:19:26.482133 systemd[1]: Started session-89.scope - Session 89 of User core. Jan 29 11:19:27.200426 sshd[8488]: Connection closed by 147.75.109.163 port 48310 Jan 29 11:19:27.200990 sshd-session[8486]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:27.206460 systemd[1]: sshd@91-78.46.186.225:22-147.75.109.163:48310.service: Deactivated successfully. Jan 29 11:19:27.208577 systemd[1]: session-89.scope: Deactivated successfully. Jan 29 11:19:27.209969 systemd-logind[1458]: Session 89 logged out. Waiting for processes to exit. Jan 29 11:19:27.211192 systemd-logind[1458]: Removed session 89. Jan 29 11:19:32.378892 systemd[1]: Started sshd@92-78.46.186.225:22-147.75.109.163:43470.service - OpenSSH per-connection server daemon (147.75.109.163:43470). Jan 29 11:19:33.384796 sshd[8498]: Accepted publickey for core from 147.75.109.163 port 43470 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:33.386951 sshd-session[8498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:33.392747 systemd-logind[1458]: New session 90 of user core. Jan 29 11:19:33.400540 systemd[1]: Started session-90.scope - Session 90 of User core. Jan 29 11:19:34.149994 sshd[8500]: Connection closed by 147.75.109.163 port 43470 Jan 29 11:19:34.149810 sshd-session[8498]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:34.156090 systemd-logind[1458]: Session 90 logged out. Waiting for processes to exit. Jan 29 11:19:34.156490 systemd[1]: sshd@92-78.46.186.225:22-147.75.109.163:43470.service: Deactivated successfully. Jan 29 11:19:34.159613 systemd[1]: session-90.scope: Deactivated successfully. Jan 29 11:19:34.162803 systemd-logind[1458]: Removed session 90. Jan 29 11:19:39.331006 systemd[1]: Started sshd@93-78.46.186.225:22-147.75.109.163:42730.service - OpenSSH per-connection server daemon (147.75.109.163:42730). 
Jan 29 11:19:40.327289 sshd[8511]: Accepted publickey for core from 147.75.109.163 port 42730 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:40.329737 sshd-session[8511]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:40.337471 systemd-logind[1458]: New session 91 of user core. Jan 29 11:19:40.343114 systemd[1]: Started session-91.scope - Session 91 of User core. Jan 29 11:19:41.104705 sshd[8513]: Connection closed by 147.75.109.163 port 42730 Jan 29 11:19:41.105782 sshd-session[8511]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:41.112068 systemd[1]: sshd@93-78.46.186.225:22-147.75.109.163:42730.service: Deactivated successfully. Jan 29 11:19:41.118116 systemd[1]: session-91.scope: Deactivated successfully. Jan 29 11:19:41.120478 systemd-logind[1458]: Session 91 logged out. Waiting for processes to exit. Jan 29 11:19:41.123080 systemd-logind[1458]: Removed session 91. Jan 29 11:19:46.280403 systemd[1]: Started sshd@94-78.46.186.225:22-147.75.109.163:42740.service - OpenSSH per-connection server daemon (147.75.109.163:42740). Jan 29 11:19:47.269804 sshd[8546]: Accepted publickey for core from 147.75.109.163 port 42740 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:47.271980 sshd-session[8546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:47.278183 systemd-logind[1458]: New session 92 of user core. Jan 29 11:19:47.281403 systemd[1]: Started session-92.scope - Session 92 of User core. Jan 29 11:19:48.030949 sshd[8548]: Connection closed by 147.75.109.163 port 42740 Jan 29 11:19:48.032405 sshd-session[8546]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:48.039237 systemd[1]: sshd@94-78.46.186.225:22-147.75.109.163:42740.service: Deactivated successfully. Jan 29 11:19:48.041654 systemd[1]: session-92.scope: Deactivated successfully. Jan 29 11:19:48.042545 systemd-logind[1458]: Session 92 logged out. Waiting for processes to exit. Jan 29 11:19:48.043628 systemd-logind[1458]: Removed session 92. Jan 29 11:19:53.211379 systemd[1]: Started sshd@95-78.46.186.225:22-147.75.109.163:55690.service - OpenSSH per-connection server daemon (147.75.109.163:55690). Jan 29 11:19:54.190940 sshd[8581]: Accepted publickey for core from 147.75.109.163 port 55690 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:19:54.192796 sshd-session[8581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:19:54.197428 systemd-logind[1458]: New session 93 of user core. Jan 29 11:19:54.203055 systemd[1]: Started session-93.scope - Session 93 of User core. Jan 29 11:19:54.942597 sshd[8583]: Connection closed by 147.75.109.163 port 55690 Jan 29 11:19:54.942457 sshd-session[8581]: pam_unix(sshd:session): session closed for user core Jan 29 11:19:54.946927 systemd[1]: sshd@95-78.46.186.225:22-147.75.109.163:55690.service: Deactivated successfully. Jan 29 11:19:54.950617 systemd[1]: session-93.scope: Deactivated successfully. Jan 29 11:19:54.952983 systemd-logind[1458]: Session 93 logged out. Waiting for processes to exit. Jan 29 11:19:54.954291 systemd-logind[1458]: Removed session 93. Jan 29 11:20:00.122263 systemd[1]: Started sshd@96-78.46.186.225:22-147.75.109.163:50770.service - OpenSSH per-connection server daemon (147.75.109.163:50770). 
Jan 29 11:20:01.128447 sshd[8596]: Accepted publickey for core from 147.75.109.163 port 50770 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:01.130986 sshd-session[8596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:01.137929 systemd-logind[1458]: New session 94 of user core. Jan 29 11:20:01.145157 systemd[1]: Started session-94.scope - Session 94 of User core. Jan 29 11:20:01.894535 sshd[8598]: Connection closed by 147.75.109.163 port 50770 Jan 29 11:20:01.893453 sshd-session[8596]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:01.898658 systemd[1]: sshd@96-78.46.186.225:22-147.75.109.163:50770.service: Deactivated successfully. Jan 29 11:20:01.901187 systemd[1]: session-94.scope: Deactivated successfully. Jan 29 11:20:01.905421 systemd-logind[1458]: Session 94 logged out. Waiting for processes to exit. Jan 29 11:20:01.908454 systemd-logind[1458]: Removed session 94. Jan 29 11:20:07.072118 systemd[1]: Started sshd@97-78.46.186.225:22-147.75.109.163:50786.service - OpenSSH per-connection server daemon (147.75.109.163:50786). Jan 29 11:20:08.078002 sshd[8635]: Accepted publickey for core from 147.75.109.163 port 50786 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:08.079989 sshd-session[8635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:08.085392 systemd-logind[1458]: New session 95 of user core. Jan 29 11:20:08.092032 systemd[1]: Started session-95.scope - Session 95 of User core. Jan 29 11:20:08.844897 sshd[8637]: Connection closed by 147.75.109.163 port 50786 Jan 29 11:20:08.845629 sshd-session[8635]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:08.850357 systemd[1]: sshd@97-78.46.186.225:22-147.75.109.163:50786.service: Deactivated successfully. Jan 29 11:20:08.852857 systemd[1]: session-95.scope: Deactivated successfully. Jan 29 11:20:08.854265 systemd-logind[1458]: Session 95 logged out. Waiting for processes to exit. Jan 29 11:20:08.856484 systemd-logind[1458]: Removed session 95. Jan 29 11:20:13.717404 systemd[1]: Started sshd@98-78.46.186.225:22-80.94.95.112:51827.service - OpenSSH per-connection server daemon (80.94.95.112:51827). Jan 29 11:20:13.908034 sshd[8667]: Invalid user daniel from 80.94.95.112 port 51827 Jan 29 11:20:14.025215 systemd[1]: Started sshd@99-78.46.186.225:22-147.75.109.163:35818.service - OpenSSH per-connection server daemon (147.75.109.163:35818). Jan 29 11:20:14.070508 sshd[8667]: Received disconnect from 80.94.95.112 port 51827:11: Bye [preauth] Jan 29 11:20:14.070880 sshd[8667]: Disconnected from invalid user daniel 80.94.95.112 port 51827 [preauth] Jan 29 11:20:14.073363 systemd[1]: sshd@98-78.46.186.225:22-80.94.95.112:51827.service: Deactivated successfully. Jan 29 11:20:15.021984 sshd[8671]: Accepted publickey for core from 147.75.109.163 port 35818 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:15.024512 sshd-session[8671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:15.030852 systemd-logind[1458]: New session 96 of user core. Jan 29 11:20:15.036129 systemd[1]: Started session-96.scope - Session 96 of User core. 
Jan 29 11:20:15.792900 sshd[8675]: Connection closed by 147.75.109.163 port 35818 Jan 29 11:20:15.793984 sshd-session[8671]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:15.800467 systemd[1]: sshd@99-78.46.186.225:22-147.75.109.163:35818.service: Deactivated successfully. Jan 29 11:20:15.800508 systemd-logind[1458]: Session 96 logged out. Waiting for processes to exit. Jan 29 11:20:15.803560 systemd[1]: session-96.scope: Deactivated successfully. Jan 29 11:20:15.804767 systemd-logind[1458]: Removed session 96. Jan 29 11:20:20.974190 systemd[1]: Started sshd@100-78.46.186.225:22-147.75.109.163:33206.service - OpenSSH per-connection server daemon (147.75.109.163:33206). Jan 29 11:20:21.947549 sshd[8685]: Accepted publickey for core from 147.75.109.163 port 33206 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:21.950129 sshd-session[8685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:21.957219 systemd-logind[1458]: New session 97 of user core. Jan 29 11:20:21.963168 systemd[1]: Started session-97.scope - Session 97 of User core. Jan 29 11:20:22.701864 sshd[8687]: Connection closed by 147.75.109.163 port 33206 Jan 29 11:20:22.703581 sshd-session[8685]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:22.708302 systemd[1]: sshd@100-78.46.186.225:22-147.75.109.163:33206.service: Deactivated successfully. Jan 29 11:20:22.710479 systemd[1]: session-97.scope: Deactivated successfully. Jan 29 11:20:22.712192 systemd-logind[1458]: Session 97 logged out. Waiting for processes to exit. Jan 29 11:20:22.713436 systemd-logind[1458]: Removed session 97. Jan 29 11:20:27.880208 systemd[1]: Started sshd@101-78.46.186.225:22-147.75.109.163:54876.service - OpenSSH per-connection server daemon (147.75.109.163:54876). Jan 29 11:20:28.877500 sshd[8721]: Accepted publickey for core from 147.75.109.163 port 54876 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:28.880030 sshd-session[8721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:28.886436 systemd-logind[1458]: New session 98 of user core. Jan 29 11:20:28.892124 systemd[1]: Started session-98.scope - Session 98 of User core. Jan 29 11:20:29.626613 sshd[8723]: Connection closed by 147.75.109.163 port 54876 Jan 29 11:20:29.628014 sshd-session[8721]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:29.633723 systemd[1]: sshd@101-78.46.186.225:22-147.75.109.163:54876.service: Deactivated successfully. Jan 29 11:20:29.637505 systemd[1]: session-98.scope: Deactivated successfully. Jan 29 11:20:29.638795 systemd-logind[1458]: Session 98 logged out. Waiting for processes to exit. Jan 29 11:20:29.640453 systemd-logind[1458]: Removed session 98. Jan 29 11:20:30.519145 systemd[1]: Started sshd@102-78.46.186.225:22-92.255.85.188:63280.service - OpenSSH per-connection server daemon (92.255.85.188:63280). Jan 29 11:20:31.491202 sshd[8734]: Invalid user admin123 from 92.255.85.188 port 63280 Jan 29 11:20:31.558883 sshd[8734]: Connection closed by invalid user admin123 92.255.85.188 port 63280 [preauth] Jan 29 11:20:31.562249 systemd[1]: sshd@102-78.46.186.225:22-92.255.85.188:63280.service: Deactivated successfully. Jan 29 11:20:34.803625 systemd[1]: Started sshd@103-78.46.186.225:22-147.75.109.163:54890.service - OpenSSH per-connection server daemon (147.75.109.163:54890). 
Jan 29 11:20:35.796890 sshd[8744]: Accepted publickey for core from 147.75.109.163 port 54890 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:35.799432 sshd-session[8744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:35.805343 systemd-logind[1458]: New session 99 of user core. Jan 29 11:20:35.809985 systemd[1]: Started session-99.scope - Session 99 of User core. Jan 29 11:20:36.553513 sshd[8746]: Connection closed by 147.75.109.163 port 54890 Jan 29 11:20:36.554423 sshd-session[8744]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:36.560931 systemd[1]: sshd@103-78.46.186.225:22-147.75.109.163:54890.service: Deactivated successfully. Jan 29 11:20:36.563919 systemd[1]: session-99.scope: Deactivated successfully. Jan 29 11:20:36.565163 systemd-logind[1458]: Session 99 logged out. Waiting for processes to exit. Jan 29 11:20:36.566255 systemd-logind[1458]: Removed session 99. Jan 29 11:20:41.731273 systemd[1]: Started sshd@104-78.46.186.225:22-147.75.109.163:50136.service - OpenSSH per-connection server daemon (147.75.109.163:50136). Jan 29 11:20:42.731008 sshd[8790]: Accepted publickey for core from 147.75.109.163 port 50136 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:42.731721 sshd-session[8790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:42.736904 systemd-logind[1458]: New session 100 of user core. Jan 29 11:20:42.742981 systemd[1]: Started session-100.scope - Session 100 of User core. Jan 29 11:20:43.486747 sshd[8792]: Connection closed by 147.75.109.163 port 50136 Jan 29 11:20:43.486616 sshd-session[8790]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:43.492587 systemd[1]: sshd@104-78.46.186.225:22-147.75.109.163:50136.service: Deactivated successfully. Jan 29 11:20:43.495144 systemd[1]: session-100.scope: Deactivated successfully. Jan 29 11:20:43.496092 systemd-logind[1458]: Session 100 logged out. Waiting for processes to exit. Jan 29 11:20:43.498469 systemd-logind[1458]: Removed session 100. Jan 29 11:20:48.659942 systemd[1]: Started sshd@105-78.46.186.225:22-147.75.109.163:39940.service - OpenSSH per-connection server daemon (147.75.109.163:39940). Jan 29 11:20:49.652752 sshd[8802]: Accepted publickey for core from 147.75.109.163 port 39940 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:49.654871 sshd-session[8802]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:49.660259 systemd-logind[1458]: New session 101 of user core. Jan 29 11:20:49.668151 systemd[1]: Started session-101.scope - Session 101 of User core. Jan 29 11:20:50.409910 sshd[8804]: Connection closed by 147.75.109.163 port 39940 Jan 29 11:20:50.410782 sshd-session[8802]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:50.415850 systemd[1]: sshd@105-78.46.186.225:22-147.75.109.163:39940.service: Deactivated successfully. Jan 29 11:20:50.419435 systemd[1]: session-101.scope: Deactivated successfully. Jan 29 11:20:50.421076 systemd-logind[1458]: Session 101 logged out. Waiting for processes to exit. Jan 29 11:20:50.422132 systemd-logind[1458]: Removed session 101. Jan 29 11:20:55.590397 systemd[1]: Started sshd@106-78.46.186.225:22-147.75.109.163:39942.service - OpenSSH per-connection server daemon (147.75.109.163:39942). 
Jan 29 11:20:56.586790 sshd[8837]: Accepted publickey for core from 147.75.109.163 port 39942 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:20:56.588913 sshd-session[8837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:20:56.593511 systemd-logind[1458]: New session 102 of user core. Jan 29 11:20:56.600106 systemd[1]: Started session-102.scope - Session 102 of User core. Jan 29 11:20:57.351198 sshd[8839]: Connection closed by 147.75.109.163 port 39942 Jan 29 11:20:57.351090 sshd-session[8837]: pam_unix(sshd:session): session closed for user core Jan 29 11:20:57.356661 systemd[1]: sshd@106-78.46.186.225:22-147.75.109.163:39942.service: Deactivated successfully. Jan 29 11:20:57.358941 systemd[1]: session-102.scope: Deactivated successfully. Jan 29 11:20:57.359894 systemd-logind[1458]: Session 102 logged out. Waiting for processes to exit. Jan 29 11:20:57.361516 systemd-logind[1458]: Removed session 102. Jan 29 11:21:02.527321 systemd[1]: Started sshd@107-78.46.186.225:22-147.75.109.163:53576.service - OpenSSH per-connection server daemon (147.75.109.163:53576). Jan 29 11:21:03.527180 sshd[8868]: Accepted publickey for core from 147.75.109.163 port 53576 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:03.528256 sshd-session[8868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:03.533608 systemd-logind[1458]: New session 103 of user core. Jan 29 11:21:03.539134 systemd[1]: Started session-103.scope - Session 103 of User core. Jan 29 11:21:04.295041 sshd[8870]: Connection closed by 147.75.109.163 port 53576 Jan 29 11:21:04.294934 sshd-session[8868]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:04.298906 systemd-logind[1458]: Session 103 logged out. Waiting for processes to exit. Jan 29 11:21:04.299174 systemd[1]: sshd@107-78.46.186.225:22-147.75.109.163:53576.service: Deactivated successfully. Jan 29 11:21:04.302312 systemd[1]: session-103.scope: Deactivated successfully. Jan 29 11:21:04.304512 systemd-logind[1458]: Removed session 103. Jan 29 11:21:09.469330 systemd[1]: Started sshd@108-78.46.186.225:22-147.75.109.163:44904.service - OpenSSH per-connection server daemon (147.75.109.163:44904). Jan 29 11:21:10.441833 sshd[8880]: Accepted publickey for core from 147.75.109.163 port 44904 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:10.443310 sshd-session[8880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:10.448788 systemd-logind[1458]: New session 104 of user core. Jan 29 11:21:10.454159 systemd[1]: Started session-104.scope - Session 104 of User core. Jan 29 11:21:11.187507 sshd[8882]: Connection closed by 147.75.109.163 port 44904 Jan 29 11:21:11.188638 sshd-session[8880]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:11.195217 systemd[1]: sshd@108-78.46.186.225:22-147.75.109.163:44904.service: Deactivated successfully. Jan 29 11:21:11.199085 systemd[1]: session-104.scope: Deactivated successfully. Jan 29 11:21:11.199924 systemd-logind[1458]: Session 104 logged out. Waiting for processes to exit. Jan 29 11:21:11.201757 systemd-logind[1458]: Removed session 104. Jan 29 11:21:16.363123 systemd[1]: Started sshd@109-78.46.186.225:22-147.75.109.163:44912.service - OpenSSH per-connection server daemon (147.75.109.163:44912). 
Jan 29 11:21:17.344904 sshd[8913]: Accepted publickey for core from 147.75.109.163 port 44912 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:17.345670 sshd-session[8913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:17.351010 systemd-logind[1458]: New session 105 of user core. Jan 29 11:21:17.358095 systemd[1]: Started session-105.scope - Session 105 of User core. Jan 29 11:21:18.093894 sshd[8915]: Connection closed by 147.75.109.163 port 44912 Jan 29 11:21:18.094666 sshd-session[8913]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:18.098094 systemd-logind[1458]: Session 105 logged out. Waiting for processes to exit. Jan 29 11:21:18.100067 systemd[1]: sshd@109-78.46.186.225:22-147.75.109.163:44912.service: Deactivated successfully. Jan 29 11:21:18.102787 systemd[1]: session-105.scope: Deactivated successfully. Jan 29 11:21:18.104239 systemd-logind[1458]: Removed session 105. Jan 29 11:21:23.274178 systemd[1]: Started sshd@110-78.46.186.225:22-147.75.109.163:38146.service - OpenSSH per-connection server daemon (147.75.109.163:38146). Jan 29 11:21:24.287154 sshd[8947]: Accepted publickey for core from 147.75.109.163 port 38146 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:24.289905 sshd-session[8947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:24.296698 systemd-logind[1458]: New session 106 of user core. Jan 29 11:21:24.301032 systemd[1]: Started session-106.scope - Session 106 of User core. Jan 29 11:21:25.044596 sshd[8949]: Connection closed by 147.75.109.163 port 38146 Jan 29 11:21:25.045857 sshd-session[8947]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:25.050009 systemd[1]: sshd@110-78.46.186.225:22-147.75.109.163:38146.service: Deactivated successfully. Jan 29 11:21:25.052044 systemd[1]: session-106.scope: Deactivated successfully. Jan 29 11:21:25.053998 systemd-logind[1458]: Session 106 logged out. Waiting for processes to exit. Jan 29 11:21:25.056641 systemd-logind[1458]: Removed session 106. Jan 29 11:21:30.220099 systemd[1]: Started sshd@111-78.46.186.225:22-147.75.109.163:46592.service - OpenSSH per-connection server daemon (147.75.109.163:46592). Jan 29 11:21:31.208473 sshd[8962]: Accepted publickey for core from 147.75.109.163 port 46592 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:31.211486 sshd-session[8962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:31.218059 systemd-logind[1458]: New session 107 of user core. Jan 29 11:21:31.224074 systemd[1]: Started session-107.scope - Session 107 of User core. Jan 29 11:21:31.957761 sshd[8964]: Connection closed by 147.75.109.163 port 46592 Jan 29 11:21:31.957614 sshd-session[8962]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:31.964187 systemd[1]: sshd@111-78.46.186.225:22-147.75.109.163:46592.service: Deactivated successfully. Jan 29 11:21:31.968504 systemd[1]: session-107.scope: Deactivated successfully. Jan 29 11:21:31.969671 systemd-logind[1458]: Session 107 logged out. Waiting for processes to exit. Jan 29 11:21:31.971327 systemd-logind[1458]: Removed session 107. Jan 29 11:21:37.135234 systemd[1]: Started sshd@112-78.46.186.225:22-147.75.109.163:46604.service - OpenSSH per-connection server daemon (147.75.109.163:46604). 
Jan 29 11:21:38.125516 sshd[8975]: Accepted publickey for core from 147.75.109.163 port 46604 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:38.128089 sshd-session[8975]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:38.133678 systemd-logind[1458]: New session 108 of user core. Jan 29 11:21:38.143164 systemd[1]: Started session-108.scope - Session 108 of User core. Jan 29 11:21:38.884923 sshd[8977]: Connection closed by 147.75.109.163 port 46604 Jan 29 11:21:38.885932 sshd-session[8975]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:38.892799 systemd[1]: sshd@112-78.46.186.225:22-147.75.109.163:46604.service: Deactivated successfully. Jan 29 11:21:38.893159 systemd-logind[1458]: Session 108 logged out. Waiting for processes to exit. Jan 29 11:21:38.896420 systemd[1]: session-108.scope: Deactivated successfully. Jan 29 11:21:38.897998 systemd-logind[1458]: Removed session 108. Jan 29 11:21:44.058038 systemd[1]: Started sshd@113-78.46.186.225:22-147.75.109.163:41030.service - OpenSSH per-connection server daemon (147.75.109.163:41030). Jan 29 11:21:45.051341 sshd[9009]: Accepted publickey for core from 147.75.109.163 port 41030 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:45.053759 sshd-session[9009]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:45.058632 systemd-logind[1458]: New session 109 of user core. Jan 29 11:21:45.063977 systemd[1]: Started session-109.scope - Session 109 of User core. Jan 29 11:21:45.811000 sshd[9011]: Connection closed by 147.75.109.163 port 41030 Jan 29 11:21:45.812217 sshd-session[9009]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:45.817289 systemd-logind[1458]: Session 109 logged out. Waiting for processes to exit. Jan 29 11:21:45.818145 systemd[1]: sshd@113-78.46.186.225:22-147.75.109.163:41030.service: Deactivated successfully. Jan 29 11:21:45.820980 systemd[1]: session-109.scope: Deactivated successfully. Jan 29 11:21:45.822399 systemd-logind[1458]: Removed session 109. Jan 29 11:21:50.989302 systemd[1]: Started sshd@114-78.46.186.225:22-147.75.109.163:34724.service - OpenSSH per-connection server daemon (147.75.109.163:34724). Jan 29 11:21:51.977056 sshd[9023]: Accepted publickey for core from 147.75.109.163 port 34724 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:51.978941 sshd-session[9023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:51.983854 systemd-logind[1458]: New session 110 of user core. Jan 29 11:21:51.989098 systemd[1]: Started session-110.scope - Session 110 of User core. Jan 29 11:21:52.730934 sshd[9025]: Connection closed by 147.75.109.163 port 34724 Jan 29 11:21:52.731569 sshd-session[9023]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:52.736245 systemd[1]: sshd@114-78.46.186.225:22-147.75.109.163:34724.service: Deactivated successfully. Jan 29 11:21:52.739336 systemd[1]: session-110.scope: Deactivated successfully. Jan 29 11:21:52.742246 systemd-logind[1458]: Session 110 logged out. Waiting for processes to exit. Jan 29 11:21:52.744520 systemd-logind[1458]: Removed session 110. Jan 29 11:21:57.907131 systemd[1]: Started sshd@115-78.46.186.225:22-147.75.109.163:37130.service - OpenSSH per-connection server daemon (147.75.109.163:37130). 
Jan 29 11:21:58.889375 sshd[9060]: Accepted publickey for core from 147.75.109.163 port 37130 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:21:58.891341 sshd-session[9060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:21:58.896193 systemd-logind[1458]: New session 111 of user core. Jan 29 11:21:58.902042 systemd[1]: Started session-111.scope - Session 111 of User core. Jan 29 11:21:59.641534 sshd[9062]: Connection closed by 147.75.109.163 port 37130 Jan 29 11:21:59.642217 sshd-session[9060]: pam_unix(sshd:session): session closed for user core Jan 29 11:21:59.649489 systemd-logind[1458]: Session 111 logged out. Waiting for processes to exit. Jan 29 11:21:59.650151 systemd[1]: sshd@115-78.46.186.225:22-147.75.109.163:37130.service: Deactivated successfully. Jan 29 11:21:59.652454 systemd[1]: session-111.scope: Deactivated successfully. Jan 29 11:21:59.654353 systemd-logind[1458]: Removed session 111. Jan 29 11:22:04.825308 systemd[1]: Started sshd@116-78.46.186.225:22-147.75.109.163:37146.service - OpenSSH per-connection server daemon (147.75.109.163:37146). Jan 29 11:22:05.820153 sshd[9095]: Accepted publickey for core from 147.75.109.163 port 37146 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:22:05.823201 sshd-session[9095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:22:05.832236 systemd-logind[1458]: New session 112 of user core. Jan 29 11:22:05.838187 systemd[1]: Started session-112.scope - Session 112 of User core. Jan 29 11:22:06.590777 sshd[9097]: Connection closed by 147.75.109.163 port 37146 Jan 29 11:22:06.591646 sshd-session[9095]: pam_unix(sshd:session): session closed for user core Jan 29 11:22:06.596713 systemd[1]: sshd@116-78.46.186.225:22-147.75.109.163:37146.service: Deactivated successfully. Jan 29 11:22:06.600577 systemd[1]: session-112.scope: Deactivated successfully. Jan 29 11:22:06.602688 systemd-logind[1458]: Session 112 logged out. Waiting for processes to exit. Jan 29 11:22:06.603960 systemd-logind[1458]: Removed session 112. Jan 29 11:22:10.766694 systemd[1]: run-containerd-runc-k8s.io-717f852657acf4b1191f7acaf67a9f4da83a386dfd9b14c5a1a2691f4b42a111-runc.BeDZyU.mount: Deactivated successfully. Jan 29 11:22:11.772235 systemd[1]: Started sshd@117-78.46.186.225:22-147.75.109.163:47700.service - OpenSSH per-connection server daemon (147.75.109.163:47700). Jan 29 11:22:12.780500 sshd[9127]: Accepted publickey for core from 147.75.109.163 port 47700 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo Jan 29 11:22:12.782790 sshd-session[9127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 11:22:12.788708 systemd-logind[1458]: New session 113 of user core. Jan 29 11:22:12.792055 systemd[1]: Started session-113.scope - Session 113 of User core. Jan 29 11:22:13.544918 sshd[9129]: Connection closed by 147.75.109.163 port 47700 Jan 29 11:22:13.545933 sshd-session[9127]: pam_unix(sshd:session): session closed for user core Jan 29 11:22:13.551698 systemd[1]: sshd@117-78.46.186.225:22-147.75.109.163:47700.service: Deactivated successfully. Jan 29 11:22:13.554366 systemd[1]: session-113.scope: Deactivated successfully. Jan 29 11:22:13.555773 systemd-logind[1458]: Session 113 logged out. Waiting for processes to exit. Jan 29 11:22:13.557706 systemd-logind[1458]: Removed session 113. 
Jan 29 11:22:18.726246 systemd[1]: Started sshd@118-78.46.186.225:22-147.75.109.163:40542.service - OpenSSH per-connection server daemon (147.75.109.163:40542).
Jan 29 11:22:19.726217 sshd[9152]: Accepted publickey for core from 147.75.109.163 port 40542 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo
Jan 29 11:22:19.729244 sshd-session[9152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 29 11:22:19.736021 systemd-logind[1458]: New session 114 of user core.
Jan 29 11:22:19.741089 systemd[1]: Started session-114.scope - Session 114 of User core.
Jan 29 11:22:20.487350 sshd[9154]: Connection closed by 147.75.109.163 port 40542
Jan 29 11:22:20.488233 sshd-session[9152]: pam_unix(sshd:session): session closed for user core
Jan 29 11:22:20.493984 systemd[1]: sshd@118-78.46.186.225:22-147.75.109.163:40542.service: Deactivated successfully.
Jan 29 11:22:20.496570 systemd[1]: session-114.scope: Deactivated successfully.
Jan 29 11:22:20.497539 systemd-logind[1458]: Session 114 logged out. Waiting for processes to exit.
Jan 29 11:22:20.499156 systemd-logind[1458]: Removed session 114.
Jan 29 11:22:25.663234 systemd[1]: Started sshd@119-78.46.186.225:22-147.75.109.163:40544.service - OpenSSH per-connection server daemon (147.75.109.163:40544).
Jan 29 11:22:26.659841 sshd[9190]: Accepted publickey for core from 147.75.109.163 port 40544 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo
Jan 29 11:22:26.661343 sshd-session[9190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 29 11:22:26.666363 systemd-logind[1458]: New session 115 of user core.
Jan 29 11:22:26.670225 systemd[1]: Started session-115.scope - Session 115 of User core.
Jan 29 11:22:27.406957 sshd[9192]: Connection closed by 147.75.109.163 port 40544
Jan 29 11:22:27.408022 sshd-session[9190]: pam_unix(sshd:session): session closed for user core
Jan 29 11:22:27.412781 systemd[1]: sshd@119-78.46.186.225:22-147.75.109.163:40544.service: Deactivated successfully.
Jan 29 11:22:27.416426 systemd[1]: session-115.scope: Deactivated successfully.
Jan 29 11:22:27.420087 systemd-logind[1458]: Session 115 logged out. Waiting for processes to exit.
Jan 29 11:22:27.421861 systemd-logind[1458]: Removed session 115.
Jan 29 11:22:32.584262 systemd[1]: Started sshd@120-78.46.186.225:22-147.75.109.163:44434.service - OpenSSH per-connection server daemon (147.75.109.163:44434).
Jan 29 11:22:33.582524 sshd[9203]: Accepted publickey for core from 147.75.109.163 port 44434 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo
Jan 29 11:22:33.584266 sshd-session[9203]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 29 11:22:33.589185 systemd-logind[1458]: New session 116 of user core.
Jan 29 11:22:33.603326 systemd[1]: Started session-116.scope - Session 116 of User core.
Jan 29 11:22:34.342622 sshd[9205]: Connection closed by 147.75.109.163 port 44434
Jan 29 11:22:34.343719 sshd-session[9203]: pam_unix(sshd:session): session closed for user core
Jan 29 11:22:34.348501 systemd[1]: sshd@120-78.46.186.225:22-147.75.109.163:44434.service: Deactivated successfully.
Jan 29 11:22:34.348525 systemd-logind[1458]: Session 116 logged out. Waiting for processes to exit.
Jan 29 11:22:34.351355 systemd[1]: session-116.scope: Deactivated successfully.
Jan 29 11:22:34.353609 systemd-logind[1458]: Removed session 116.
Jan 29 11:22:39.517271 systemd[1]: Started sshd@121-78.46.186.225:22-147.75.109.163:37140.service - OpenSSH per-connection server daemon (147.75.109.163:37140).
Jan 29 11:22:40.506674 sshd[9216]: Accepted publickey for core from 147.75.109.163 port 37140 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo
Jan 29 11:22:40.508673 sshd-session[9216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 29 11:22:40.514168 systemd-logind[1458]: New session 117 of user core.
Jan 29 11:22:40.519246 systemd[1]: Started session-117.scope - Session 117 of User core.
Jan 29 11:22:41.290747 sshd[9218]: Connection closed by 147.75.109.163 port 37140
Jan 29 11:22:41.292222 sshd-session[9216]: pam_unix(sshd:session): session closed for user core
Jan 29 11:22:41.299485 systemd[1]: sshd@121-78.46.186.225:22-147.75.109.163:37140.service: Deactivated successfully.
Jan 29 11:22:41.303799 systemd[1]: session-117.scope: Deactivated successfully.
Jan 29 11:22:41.308903 systemd-logind[1458]: Session 117 logged out. Waiting for processes to exit.
Jan 29 11:22:41.311402 systemd-logind[1458]: Removed session 117.
Jan 29 11:22:46.469179 systemd[1]: Started sshd@122-78.46.186.225:22-147.75.109.163:37154.service - OpenSSH per-connection server daemon (147.75.109.163:37154).
Jan 29 11:22:47.471466 sshd[9250]: Accepted publickey for core from 147.75.109.163 port 37154 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo
Jan 29 11:22:47.473394 sshd-session[9250]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 29 11:22:47.478553 systemd-logind[1458]: New session 118 of user core.
Jan 29 11:22:47.483080 systemd[1]: Started session-118.scope - Session 118 of User core.
Jan 29 11:22:48.231884 sshd[9252]: Connection closed by 147.75.109.163 port 37154
Jan 29 11:22:48.232144 sshd-session[9250]: pam_unix(sshd:session): session closed for user core
Jan 29 11:22:48.236281 systemd[1]: sshd@122-78.46.186.225:22-147.75.109.163:37154.service: Deactivated successfully.
Jan 29 11:22:48.239014 systemd[1]: session-118.scope: Deactivated successfully.
Jan 29 11:22:48.241307 systemd-logind[1458]: Session 118 logged out. Waiting for processes to exit.
Jan 29 11:22:48.243155 systemd-logind[1458]: Removed session 118.
Jan 29 11:22:53.406239 systemd[1]: Started sshd@123-78.46.186.225:22-147.75.109.163:41786.service - OpenSSH per-connection server daemon (147.75.109.163:41786).
Jan 29 11:22:54.385774 sshd[9286]: Accepted publickey for core from 147.75.109.163 port 41786 ssh2: RSA SHA256:nclG6x2+CCPDg1J87dfSmoG85ir0BMjvhJKqcua3Jmo
Jan 29 11:22:54.387642 sshd-session[9286]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 29 11:22:54.393524 systemd-logind[1458]: New session 119 of user core.
Jan 29 11:22:54.399188 systemd[1]: Started session-119.scope - Session 119 of User core.
Jan 29 11:22:55.138808 sshd[9288]: Connection closed by 147.75.109.163 port 41786
Jan 29 11:22:55.138332 sshd-session[9286]: pam_unix(sshd:session): session closed for user core
Jan 29 11:22:55.141529 systemd[1]: sshd@123-78.46.186.225:22-147.75.109.163:41786.service: Deactivated successfully.
Jan 29 11:22:55.145764 systemd[1]: session-119.scope: Deactivated successfully.
Jan 29 11:22:55.149370 systemd-logind[1458]: Session 119 logged out. Waiting for processes to exit.
Jan 29 11:22:55.150563 systemd-logind[1458]: Removed session 119.
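Note: the run above repeats one pattern for sessions 108 through 119: roughly every six to seven seconds the host at 147.75.109.163 opens a connection, systemd starts a per-connection sshd@... service, the publickey for user core is accepted, and the session is torn down again in under a second. A minimal sketch, assuming the journal has been dumped to a plain-text file (here called journal.log, an assumed name), that pairs the systemd-logind "New session"/"Removed session" entries and prints each session's lifetime:

#!/usr/bin/env python3
"""Hypothetical helper: summarise the short-lived SSH sessions seen above.

Assumes a plain-text journal dump ('journal.log') whose lines look like
  Jan 29 11:21:38.133678 systemd-logind[1458]: New session 108 of user core.
  Jan 29 11:21:38.897998 systemd-logind[1458]: Removed session 108.
"""
import re
from datetime import datetime

TS = r"(?P<ts>\w{3} +\d+ \d{2}:\d{2}:\d{2}\.\d+)"
NEW = re.compile(TS + r" systemd-logind\[\d+\]: New session (?P<id>\d+) of user (?P<user>\S+)\.")
GONE = re.compile(TS + r" systemd-logind\[\d+\]: Removed session (?P<id>\d+)\.")

def parse_ts(raw: str) -> datetime:
    # The journal dump carries no year; assuming 2025 from the ISO timestamps elsewhere.
    return datetime.strptime("2025 " + raw, "%Y %b %d %H:%M:%S.%f")

opened: dict[str, tuple[str, datetime]] = {}
with open("journal.log") as fh:
    for line in fh:
        if m := NEW.search(line):
            opened[m["id"]] = (m["user"], parse_ts(m["ts"]))
        elif (m := GONE.search(line)) and m["id"] in opened:
            user, start = opened.pop(m["id"])
            dur = (parse_ts(m["ts"]) - start).total_seconds()
            print(f"session {m['id']:>4} user={user:<6} duration={dur:6.2f}s")

Run against the entries above, this would show every session lasting well under a second, which is consistent with an automated probe or orchestration agent rather than interactive use.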
Jan 29 11:23:03.524143 systemd[1]: Started sshd@124-78.46.186.225:22-195.178.110.65:54888.service - OpenSSH per-connection server daemon (195.178.110.65:54888).
Jan 29 11:23:03.597873 sshd[9320]: Invalid user node from 195.178.110.65 port 54888
Jan 29 11:23:03.609792 sshd[9320]: Connection closed by invalid user node 195.178.110.65 port 54888 [preauth]
Jan 29 11:23:03.613694 systemd[1]: sshd@124-78.46.186.225:22-195.178.110.65:54888.service: Deactivated successfully.
Jan 29 11:23:11.247886 systemd[1]: cri-containerd-d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711.scope: Deactivated successfully.
Jan 29 11:23:11.248164 systemd[1]: cri-containerd-d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711.scope: Consumed 10.861s CPU time.
Jan 29 11:23:11.273851 containerd[1478]: time="2025-01-29T11:23:11.272104737Z" level=info msg="shim disconnected" id=d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711 namespace=k8s.io
Jan 29 11:23:11.273851 containerd[1478]: time="2025-01-29T11:23:11.272161377Z" level=warning msg="cleaning up after shim disconnected" id=d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711 namespace=k8s.io
Jan 29 11:23:11.273851 containerd[1478]: time="2025-01-29T11:23:11.272169537Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 11:23:11.273745 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711-rootfs.mount: Deactivated successfully.
Jan 29 11:23:11.468120 systemd[1]: cri-containerd-1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4.scope: Deactivated successfully.
Jan 29 11:23:11.468446 systemd[1]: cri-containerd-1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4.scope: Consumed 4.355s CPU time, 16.0M memory peak, 0B memory swap peak.
Jan 29 11:23:11.471153 kubelet[2815]: E0129 11:23:11.470253 2815 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:51026->10.0.0.2:2379: read: connection timed out"
Jan 29 11:23:11.493330 containerd[1478]: time="2025-01-29T11:23:11.493234191Z" level=info msg="shim disconnected" id=1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4 namespace=k8s.io
Jan 29 11:23:11.493330 containerd[1478]: time="2025-01-29T11:23:11.493285392Z" level=warning msg="cleaning up after shim disconnected" id=1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4 namespace=k8s.io
Jan 29 11:23:11.493330 containerd[1478]: time="2025-01-29T11:23:11.493294112Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 11:23:11.494715 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4-rootfs.mount: Deactivated successfully.
Jan 29 11:23:11.545873 systemd[1]: cri-containerd-474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b.scope: Deactivated successfully.
Jan 29 11:23:11.548204 systemd[1]: cri-containerd-474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b.scope: Consumed 14.655s CPU time, 21.7M memory peak, 0B memory swap peak.
Jan 29 11:23:11.576254 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b-rootfs.mount: Deactivated successfully.
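Note: two distinct things happen in the block above. A pre-auth login attempt for the invalid user "node" from 195.178.110.65 is rejected, and a few seconds later three cri-containerd-*.scope units are deactivated while the kubelet reports "Failed to update lease" because the apiserver's read from etcd at 10.0.0.2:2379 timed out. A minimal sketch, assuming cluster access through the default kubeconfig and the official kubernetes Python client, for checking how stale the node Leases in kube-node-lease are after such timeouts (the 40-second threshold mirrors the default node lease duration and is an assumption):

"""Hypothetical check: how stale are node Leases after the etcd timeouts above?"""
from datetime import datetime, timezone

from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
coord = client.CoordinationV1Api()

now = datetime.now(timezone.utc)
for lease in coord.list_namespaced_lease("kube-node-lease").items:
    renew = lease.spec.renew_time  # last successful renewal by the node's kubelet
    age = (now - renew).total_seconds() if renew else float("inf")
    flag = "STALE" if age > 40 else "ok"
    print(f"{lease.metadata.name:<40} last renew {age:6.1f}s ago  {flag}")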
Jan 29 11:23:11.577031 containerd[1478]: time="2025-01-29T11:23:11.576323968Z" level=info msg="shim disconnected" id=474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b namespace=k8s.io
Jan 29 11:23:11.577031 containerd[1478]: time="2025-01-29T11:23:11.576401689Z" level=warning msg="cleaning up after shim disconnected" id=474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b namespace=k8s.io
Jan 29 11:23:11.577031 containerd[1478]: time="2025-01-29T11:23:11.576410729Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 11:23:11.900973 kubelet[2815]: I0129 11:23:11.900868 2815 scope.go:117] "RemoveContainer" containerID="d61e002a0cd93524fe8e2e1fd29882dedc8031bdde0630f587259c219b886711"
Jan 29 11:23:11.905103 kubelet[2815]: I0129 11:23:11.905074 2815 scope.go:117] "RemoveContainer" containerID="474c243b4315e19138567a07b9c1087041765728dbc846c2e524e86366d6986b"
Jan 29 11:23:11.908149 kubelet[2815]: I0129 11:23:11.908127 2815 scope.go:117] "RemoveContainer" containerID="1fb358a3bcfedb1b677604b9c4b0724566b1bbcd17c35404ce9fca34c16db0c4"
Jan 29 11:23:11.908758 containerd[1478]: time="2025-01-29T11:23:11.908695515Z" level=info msg="CreateContainer within sandbox \"a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jan 29 11:23:11.911562 containerd[1478]: time="2025-01-29T11:23:11.911424774Z" level=info msg="CreateContainer within sandbox \"6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 29 11:23:11.916035 containerd[1478]: time="2025-01-29T11:23:11.916005325Z" level=info msg="CreateContainer within sandbox \"bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jan 29 11:23:11.928644 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3766968733.mount: Deactivated successfully.
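Note: after the three "shim disconnected" cleanups, the kubelet removes the dead container IDs and asks containerd to create replacements inside the still-running pod sandboxes for kube-controller-manager, kube-scheduler and tigera-operator; Attempt:1 marks these as first restarts. A minimal sketch that pulls the sandbox ID, container name and attempt counter out of the "CreateContainer within sandbox" entries above (the journal.log file name is an assumption; the regex targets the escaped-quote form the IDs take in this dump):

"""Hypothetical parser for the containerd CreateContainer entries above."""
import re

CREATE = re.compile(
    r'msg="CreateContainer within sandbox \\"(?P<sandbox>[0-9a-f]{64})\\" '
    r'for container &ContainerMetadata\{Name:(?P<name>[^,]+),Attempt:(?P<attempt>\d+),\}"'
)

with open("journal.log") as fh:
    for line in fh:
        if m := CREATE.search(line):
            print(f"{m['name']:<25} attempt={m['attempt']} sandbox={m['sandbox'][:12]}...")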
Jan 29 11:23:11.940979 containerd[1478]: time="2025-01-29T11:23:11.940933858Z" level=info msg="CreateContainer within sandbox \"a147ed667b44b61d760ac5131f6256f90b759dcab00e1b7fee88475d8efbd572\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"1219c5a8ac78f5c7e76c91096c43c1b3c8380eeda4e645e6a369487b061f2ea0\""
Jan 29 11:23:11.942313 containerd[1478]: time="2025-01-29T11:23:11.941593143Z" level=info msg="CreateContainer within sandbox \"6c712e10d83985c764ea5dc2bad3a714af9927c747e69d827c1178a3ff2eccba\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"0d27cf9f54d5f588debedbd082f9af77ca1472e76fccfc61301c2f8c491e545b\""
Jan 29 11:23:11.942571 containerd[1478]: time="2025-01-29T11:23:11.942480709Z" level=info msg="StartContainer for \"1219c5a8ac78f5c7e76c91096c43c1b3c8380eeda4e645e6a369487b061f2ea0\""
Jan 29 11:23:11.944091 containerd[1478]: time="2025-01-29T11:23:11.942968593Z" level=info msg="CreateContainer within sandbox \"bf827e428d867183a0e0becc360b80c02e87bca5aec5c57f0033b186440f6c1d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"6963e5073f3e84da0aa096f485258f6318a2ed281f145ff686c1b1cedf8e072d\""
Jan 29 11:23:11.944091 containerd[1478]: time="2025-01-29T11:23:11.943113114Z" level=info msg="StartContainer for \"0d27cf9f54d5f588debedbd082f9af77ca1472e76fccfc61301c2f8c491e545b\""
Jan 29 11:23:11.945853 containerd[1478]: time="2025-01-29T11:23:11.944755245Z" level=info msg="StartContainer for \"6963e5073f3e84da0aa096f485258f6318a2ed281f145ff686c1b1cedf8e072d\""
Jan 29 11:23:11.979273 systemd[1]: Started cri-containerd-0d27cf9f54d5f588debedbd082f9af77ca1472e76fccfc61301c2f8c491e545b.scope - libcontainer container 0d27cf9f54d5f588debedbd082f9af77ca1472e76fccfc61301c2f8c491e545b.
Jan 29 11:23:11.986175 systemd[1]: Started cri-containerd-6963e5073f3e84da0aa096f485258f6318a2ed281f145ff686c1b1cedf8e072d.scope - libcontainer container 6963e5073f3e84da0aa096f485258f6318a2ed281f145ff686c1b1cedf8e072d.
Jan 29 11:23:11.997996 systemd[1]: Started cri-containerd-1219c5a8ac78f5c7e76c91096c43c1b3c8380eeda4e645e6a369487b061f2ea0.scope - libcontainer container 1219c5a8ac78f5c7e76c91096c43c1b3c8380eeda4e645e6a369487b061f2ea0.
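Note: CreateContainer returns the new container IDs, StartContainer is issued for each, and systemd starts the matching cri-containerd-*.scope units. A hedged way to confirm the same restarts from the API side is to read restartCount from the kube-system pods; this sketch assumes the official kubernetes Python client and a working kubeconfig:

"""Hypothetical follow-up: confirm the restarts recorded above (Attempt:1)
by reading restartCount from the kube-system pods."""
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod("kube-system").items:
    for cs in pod.status.container_statuses or []:
        if cs.restart_count:
            print(f"{pod.metadata.name:<50} {cs.name:<25} restarts={cs.restart_count}")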
Jan 29 11:23:12.060004 containerd[1478]: time="2025-01-29T11:23:12.059958884Z" level=info msg="StartContainer for \"0d27cf9f54d5f588debedbd082f9af77ca1472e76fccfc61301c2f8c491e545b\" returns successfully"
Jan 29 11:23:12.064927 containerd[1478]: time="2025-01-29T11:23:12.064780557Z" level=info msg="StartContainer for \"6963e5073f3e84da0aa096f485258f6318a2ed281f145ff686c1b1cedf8e072d\" returns successfully"
Jan 29 11:23:12.069754 containerd[1478]: time="2025-01-29T11:23:12.069704991Z" level=info msg="StartContainer for \"1219c5a8ac78f5c7e76c91096c43c1b3c8380eeda4e645e6a369487b061f2ea0\" returns successfully"
Jan 29 11:23:15.176835 kubelet[2815]: E0129 11:23:15.176391 2815 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:50836->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4152-2-0-b-6e231d00a9.181f260523b1bd3d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4152-2-0-b-6e231d00a9,UID:31f5e2f665aa44c47678d1c5dcb56330,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4152-2-0-b-6e231d00a9,},FirstTimestamp:2025-01-29 11:23:04.703843645 +0000 UTC m=+1044.152199791,LastTimestamp:2025-01-29 11:23:04.703843645 +0000 UTC m=+1044.152199791,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4152-2-0-b-6e231d00a9,}"
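Note: all three replacement containers report "returns successfully", but the final entry shows the kubelet unable to persist an Unhealthy event recording that the kube-apiserver readiness probe returned HTTP 500, again because of the etcd read timeout. A sketch for querying the apiserver's /livez and /readyz endpoints directly; the address 10.0.0.3:6443 is an assumption taken from the node side of the etcd connection above, TLS verification is skipped only for a quick local check, and "?verbose" makes the apiserver list which individual checks (including etcd) are failing:

"""Hypothetical probe of the kube-apiserver health endpoints referenced by the
failed readiness probe above."""
import ssl
import urllib.request

APISERVER = "https://10.0.0.3:6443"  # assumed apiserver address, not from the log verbatim
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # skip TLS verification for this quick check only

for path in ("/livez?verbose", "/readyz?verbose"):
    try:
        with urllib.request.urlopen(APISERVER + path, context=ctx, timeout=5) as resp:
            print(path, resp.status)
            print(resp.read().decode())
    except Exception as exc:  # an HTTP 500 surfaces here as HTTPError
        print(path, "failed:", exc)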