Oct 9 02:43:56.882508 kernel: Linux version 6.6.54-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Tue Oct 8 23:33:43 -00 2024
Oct 9 02:43:56.882530 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ecc53326196a1bacd9ba781ce772ef34cdd5fe5561cf830307501ec3d5ba168a
Oct 9 02:43:56.882538 kernel: BIOS-provided physical RAM map:
Oct 9 02:43:56.882544 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 9 02:43:56.882549 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 9 02:43:56.882554 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 9 02:43:56.882559 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Oct 9 02:43:56.882565 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Oct 9 02:43:56.882572 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Oct 9 02:43:56.882577 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Oct 9 02:43:56.882582 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 9 02:43:56.882587 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 9 02:43:56.882592 kernel: NX (Execute Disable) protection: active
Oct 9 02:43:56.882621 kernel: APIC: Static calls initialized
Oct 9 02:43:56.882630 kernel: SMBIOS 2.8 present.
Oct 9 02:43:56.882636 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Oct 9 02:43:56.882642 kernel: Hypervisor detected: KVM
Oct 9 02:43:56.882647 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 9 02:43:56.882653 kernel: kvm-clock: using sched offset of 2857404133 cycles
Oct 9 02:43:56.882659 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 9 02:43:56.882664 kernel: tsc: Detected 2445.404 MHz processor
Oct 9 02:43:56.882670 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 9 02:43:56.882676 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 9 02:43:56.882684 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Oct 9 02:43:56.882689 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 9 02:43:56.882695 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Oct 9 02:43:56.882700 kernel: Using GB pages for direct mapping
Oct 9 02:43:56.882706 kernel: ACPI: Early table checksum verification disabled
Oct 9 02:43:56.882711 kernel: ACPI: RSDP 0x00000000000F51F0 000014 (v00 BOCHS )
Oct 9 02:43:56.882717 kernel: ACPI: RSDT 0x000000007CFE265D 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882723 kernel: ACPI: FACP 0x000000007CFE244D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882728 kernel: ACPI: DSDT 0x000000007CFE0040 00240D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882736 kernel: ACPI: FACS 0x000000007CFE0000 000040
Oct 9 02:43:56.882741 kernel: ACPI: APIC 0x000000007CFE2541 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882747 kernel: ACPI: HPET 0x000000007CFE25C1 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882752 kernel: ACPI: MCFG 0x000000007CFE25F9 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882758 kernel: ACPI: WAET 0x000000007CFE2635 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Oct 9 02:43:56.882764 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe244d-0x7cfe2540]
Oct 9 02:43:56.882769 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe244c]
Oct 9 02:43:56.882775 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Oct 9 02:43:56.882785 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2541-0x7cfe25c0]
Oct 9 02:43:56.882791 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25c1-0x7cfe25f8]
Oct 9 02:43:56.882797 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe25f9-0x7cfe2634]
Oct 9 02:43:56.882803 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe2635-0x7cfe265c]
Oct 9 02:43:56.882809 kernel: No NUMA configuration found
Oct 9 02:43:56.882815 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Oct 9 02:43:56.882823 kernel: NODE_DATA(0) allocated [mem 0x7cfd6000-0x7cfdbfff]
Oct 9 02:43:56.882828 kernel: Zone ranges:
Oct 9 02:43:56.882834 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Oct 9 02:43:56.882840 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Oct 9 02:43:56.882846 kernel: Normal empty
Oct 9 02:43:56.882852 kernel: Movable zone start for each node
Oct 9 02:43:56.882858 kernel: Early memory node ranges
Oct 9 02:43:56.882863 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Oct 9 02:43:56.882869 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Oct 9 02:43:56.882875 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Oct 9 02:43:56.882883 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 9 02:43:56.882889 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 9 02:43:56.882894 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Oct 9 02:43:56.882900 kernel: ACPI: PM-Timer IO Port: 0x608
Oct 9 02:43:56.882906 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 9 02:43:56.882912 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 9 02:43:56.882917 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 9 02:43:56.882923 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 9 02:43:56.882929 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 9 02:43:56.882937 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 9 02:43:56.882943 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 9 02:43:56.882948 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 9 02:43:56.882954 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Oct 9 02:43:56.882960 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Oct 9 02:43:56.882966 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 9 02:43:56.882972 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Oct 9 02:43:56.882977 kernel: Booting paravirtualized kernel on KVM
Oct 9 02:43:56.882983 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 9 02:43:56.882991 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Oct 9 02:43:56.882997 kernel: percpu: Embedded 58 pages/cpu s196904 r8192 d32472 u1048576
Oct 9 02:43:56.883003 kernel: pcpu-alloc: s196904 r8192 d32472 u1048576 alloc=1*2097152
Oct 9 02:43:56.883009 kernel: pcpu-alloc: [0] 0 1
Oct 9 02:43:56.883014 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 9 02:43:56.883021 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ecc53326196a1bacd9ba781ce772ef34cdd5fe5561cf830307501ec3d5ba168a
Oct 9 02:43:56.883027 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Oct 9 02:43:56.883033 kernel: random: crng init done
Oct 9 02:43:56.883041 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Oct 9 02:43:56.883046 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 9 02:43:56.883052 kernel: Fallback order for Node 0: 0
Oct 9 02:43:56.883058 kernel: Built 1 zonelists, mobility grouping on. Total pages: 503708
Oct 9 02:43:56.883064 kernel: Policy zone: DMA32
Oct 9 02:43:56.883070 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 9 02:43:56.883093 kernel: Memory: 1922056K/2047464K available (12288K kernel code, 2305K rwdata, 22728K rodata, 42872K init, 2316K bss, 125148K reserved, 0K cma-reserved)
Oct 9 02:43:56.883104 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Oct 9 02:43:56.883122 kernel: ftrace: allocating 37786 entries in 148 pages
Oct 9 02:43:56.883138 kernel: ftrace: allocated 148 pages with 3 groups
Oct 9 02:43:56.883149 kernel: Dynamic Preempt: voluntary
Oct 9 02:43:56.883160 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 9 02:43:56.883168 kernel: rcu: RCU event tracing is enabled.
Oct 9 02:43:56.883174 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Oct 9 02:43:56.883181 kernel: Trampoline variant of Tasks RCU enabled.
Oct 9 02:43:56.883187 kernel: Rude variant of Tasks RCU enabled.
Oct 9 02:43:56.883193 kernel: Tracing variant of Tasks RCU enabled.
Oct 9 02:43:56.883199 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 9 02:43:56.883207 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Oct 9 02:43:56.883213 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Oct 9 02:43:56.883219 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 9 02:43:56.883225 kernel: Console: colour VGA+ 80x25
Oct 9 02:43:56.883231 kernel: printk: console [tty0] enabled
Oct 9 02:43:56.883236 kernel: printk: console [ttyS0] enabled
Oct 9 02:43:56.883242 kernel: ACPI: Core revision 20230628
Oct 9 02:43:56.883248 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Oct 9 02:43:56.883254 kernel: APIC: Switch to symmetric I/O mode setup
Oct 9 02:43:56.883262 kernel: x2apic enabled
Oct 9 02:43:56.883268 kernel: APIC: Switched APIC routing to: physical x2apic
Oct 9 02:43:56.883273 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Oct 9 02:43:56.883279 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 9 02:43:56.883285 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Oct 9 02:43:56.883291 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 9 02:43:56.883297 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 9 02:43:56.883302 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 9 02:43:56.883308 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 9 02:43:56.883322 kernel: Spectre V2 : Mitigation: Retpolines
Oct 9 02:43:56.883329 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Oct 9 02:43:56.883335 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Oct 9 02:43:56.883343 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 9 02:43:56.883349 kernel: RETBleed: Mitigation: untrained return thunk
Oct 9 02:43:56.883355 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 9 02:43:56.883361 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 9 02:43:56.883367 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 9 02:43:56.883374 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 9 02:43:56.883380 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 9 02:43:56.883387 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 9 02:43:56.883395 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 9 02:43:56.883401 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 9 02:43:56.883407 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Oct 9 02:43:56.883413 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 9 02:43:56.883419 kernel: Freeing SMP alternatives memory: 32K
Oct 9 02:43:56.883427 kernel: pid_max: default: 32768 minimum: 301
Oct 9 02:43:56.883433 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Oct 9 02:43:56.883439 kernel: landlock: Up and running.
Oct 9 02:43:56.883445 kernel: SELinux: Initializing.
Oct 9 02:43:56.883452 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Oct 9 02:43:56.883458 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Oct 9 02:43:56.883464 kernel: smpboot: CPU0: AMD EPYC Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 9 02:43:56.883470 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Oct 9 02:43:56.883476 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Oct 9 02:43:56.883484 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Oct 9 02:43:56.883490 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 9 02:43:56.883496 kernel: ... version: 0
Oct 9 02:43:56.883502 kernel: ... bit width: 48
Oct 9 02:43:56.883508 kernel: ... generic registers: 6
Oct 9 02:43:56.883514 kernel: ... value mask: 0000ffffffffffff
Oct 9 02:43:56.883520 kernel: ... max period: 00007fffffffffff
Oct 9 02:43:56.883526 kernel: ... fixed-purpose events: 0
Oct 9 02:43:56.883533 kernel: ... event mask: 000000000000003f
Oct 9 02:43:56.883541 kernel: signal: max sigframe size: 1776
Oct 9 02:43:56.883547 kernel: rcu: Hierarchical SRCU implementation.
Oct 9 02:43:56.883553 kernel: rcu: Max phase no-delay instances is 400.
Oct 9 02:43:56.883559 kernel: smp: Bringing up secondary CPUs ...
Oct 9 02:43:56.883565 kernel: smpboot: x86: Booting SMP configuration:
Oct 9 02:43:56.883571 kernel: .... node #0, CPUs: #1
Oct 9 02:43:56.883577 kernel: smp: Brought up 1 node, 2 CPUs
Oct 9 02:43:56.883583 kernel: smpboot: Max logical packages: 1
Oct 9 02:43:56.883590 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS)
Oct 9 02:43:56.883611 kernel: devtmpfs: initialized
Oct 9 02:43:56.883617 kernel: x86/mm: Memory block size: 128MB
Oct 9 02:43:56.883624 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 9 02:43:56.883630 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Oct 9 02:43:56.883636 kernel: pinctrl core: initialized pinctrl subsystem
Oct 9 02:43:56.883642 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 9 02:43:56.883648 kernel: audit: initializing netlink subsys (disabled)
Oct 9 02:43:56.883654 kernel: audit: type=2000 audit(1728441836.171:1): state=initialized audit_enabled=0 res=1
Oct 9 02:43:56.883660 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 9 02:43:56.883669 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 9 02:43:56.883675 kernel: cpuidle: using governor menu
Oct 9 02:43:56.883681 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 9 02:43:56.883687 kernel: dca service started, version 1.12.1
Oct 9 02:43:56.883693 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Oct 9 02:43:56.883699 kernel: PCI: Using configuration type 1 for base access
Oct 9 02:43:56.883705 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 9 02:43:56.883711 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 9 02:43:56.883717 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 9 02:43:56.883726 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 9 02:43:56.883732 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 9 02:43:56.883738 kernel: ACPI: Added _OSI(Module Device)
Oct 9 02:43:56.883744 kernel: ACPI: Added _OSI(Processor Device)
Oct 9 02:43:56.883750 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 9 02:43:56.883756 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 9 02:43:56.883762 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 9 02:43:56.883768 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 9 02:43:56.883774 kernel: ACPI: Interpreter enabled
Oct 9 02:43:56.883782 kernel: ACPI: PM: (supports S0 S5)
Oct 9 02:43:56.883788 kernel: ACPI: Using IOAPIC for interrupt routing
Oct 9 02:43:56.883794 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 9 02:43:56.883801 kernel: PCI: Using E820 reservations for host bridge windows
Oct 9 02:43:56.883807 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Oct 9 02:43:56.883813 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 9 02:43:56.883973 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Oct 9 02:43:56.884087 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Oct 9 02:43:56.884197 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Oct 9 02:43:56.884206 kernel: PCI host bridge to bus 0000:00
Oct 9 02:43:56.884319 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Oct 9 02:43:56.884418 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Oct 9 02:43:56.884513 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 9 02:43:56.884631 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Oct 9 02:43:56.884732 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 9 02:43:56.884831 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Oct 9 02:43:56.884925 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 9 02:43:56.885046 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Oct 9 02:43:56.885160 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Oct 9 02:43:56.885265 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfb800000-0xfbffffff pref]
Oct 9 02:43:56.885368 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfd200000-0xfd203fff 64bit pref]
Oct 9 02:43:56.885476 kernel: pci 0000:00:01.0: reg 0x20: [mem 0xfea10000-0xfea10fff]
Oct 9 02:43:56.885578 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea00000-0xfea0ffff pref]
Oct 9 02:43:56.886022 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 9 02:43:56.886159 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.886271 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea11000-0xfea11fff]
Oct 9 02:43:56.886386 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.886493 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea12000-0xfea12fff]
Oct 9 02:43:56.886629 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.886741 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea13000-0xfea13fff]
Oct 9 02:43:56.886854 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.886959 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea14000-0xfea14fff]
Oct 9 02:43:56.887071 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.887193 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea15000-0xfea15fff]
Oct 9 02:43:56.887315 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.887420 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea16000-0xfea16fff]
Oct 9 02:43:56.887531 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.887702 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea17000-0xfea17fff]
Oct 9 02:43:56.887823 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.887929 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea18000-0xfea18fff]
Oct 9 02:43:56.888047 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Oct 9 02:43:56.888152 kernel: pci 0000:00:03.0: reg 0x10: [mem 0xfea19000-0xfea19fff]
Oct 9 02:43:56.888337 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Oct 9 02:43:56.888445 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Oct 9 02:43:56.888556 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Oct 9 02:43:56.888693 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc040-0xc05f]
Oct 9 02:43:56.888805 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea1a000-0xfea1afff]
Oct 9 02:43:56.888918 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Oct 9 02:43:56.889022 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Oct 9 02:43:56.889139 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Oct 9 02:43:56.889344 kernel: pci 0000:01:00.0: reg 0x14: [mem 0xfe880000-0xfe880fff]
Oct 9 02:43:56.889777 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Oct 9 02:43:56.889919 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfe800000-0xfe87ffff pref]
Oct 9 02:43:56.890034 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Oct 9 02:43:56.890138 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Oct 9 02:43:56.890242 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Oct 9 02:43:56.890362 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Oct 9 02:43:56.890471 kernel: pci 0000:02:00.0: reg 0x10: [mem 0xfe600000-0xfe603fff 64bit]
Oct 9 02:43:56.890575 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Oct 9 02:43:56.890701 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Oct 9 02:43:56.890807 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Oct 9 02:43:56.890965 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Oct 9 02:43:56.891503 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfe400000-0xfe400fff]
Oct 9 02:43:56.893801 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xfcc00000-0xfcc03fff 64bit pref]
Oct 9 02:43:56.893922 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Oct 9 02:43:56.894051 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Oct 9 02:43:56.894158 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Oct 9 02:43:56.894508 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Oct 9 02:43:56.894722 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Oct 9 02:43:56.894832 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Oct 9 02:43:56.894937 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Oct 9 02:43:56.895041 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Oct 9 02:43:56.895187 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Oct 9 02:43:56.895300 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xfc800000-0xfc803fff 64bit pref]
Oct 9 02:43:56.895411 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Oct 9 02:43:56.895515 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Oct 9 02:43:56.897403 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Oct 9 02:43:56.897536 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Oct 9 02:43:56.898168 kernel: pci 0000:06:00.0: reg 0x14: [mem 0xfde00000-0xfde00fff]
Oct 9 02:43:56.898286 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xfc600000-0xfc603fff 64bit pref]
Oct 9 02:43:56.898392 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Oct 9 02:43:56.898503 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Oct 9 02:43:56.899201 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Oct 9 02:43:56.899215 kernel: acpiphp: Slot [0] registered
Oct 9 02:43:56.899343 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Oct 9 02:43:56.899455 kernel: pci 0000:07:00.0: reg 0x14: [mem 0xfdc80000-0xfdc80fff]
Oct 9 02:43:56.899564 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xfc400000-0xfc403fff 64bit pref]
Oct 9 02:43:56.901717 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfdc00000-0xfdc7ffff pref]
Oct 9 02:43:56.901833 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Oct 9 02:43:56.901946 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Oct 9 02:43:56.902050 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Oct 9 02:43:56.902059 kernel: acpiphp: Slot [0-2] registered
Oct 9 02:43:56.902165 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Oct 9 02:43:56.902271 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Oct 9 02:43:56.902375 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Oct 9 02:43:56.902384 kernel: acpiphp: Slot [0-3] registered
Oct 9 02:43:56.902487 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Oct 9 02:43:56.902595 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Oct 9 02:43:56.902994 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Oct 9 02:43:56.903004 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 9 02:43:56.903011 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 9 02:43:56.903018 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 9 02:43:56.903024 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 9 02:43:56.903031 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Oct 9 02:43:56.903037 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Oct 9 02:43:56.903043 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Oct 9 02:43:56.903054 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Oct 9 02:43:56.903060 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Oct 9 02:43:56.903066 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Oct 9 02:43:56.903072 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Oct 9 02:43:56.903103 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Oct 9 02:43:56.903110 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Oct 9 02:43:56.903116 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Oct 9 02:43:56.903123 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Oct 9 02:43:56.903129 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Oct 9 02:43:56.903138 kernel: iommu: Default domain type: Translated
Oct 9 02:43:56.903144 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 9 02:43:56.903151 kernel: PCI: Using ACPI for IRQ routing
Oct 9 02:43:56.903157 kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 9 02:43:56.903163 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 9 02:43:56.903169 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Oct 9 02:43:56.903281 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Oct 9 02:43:56.903426 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Oct 9 02:43:56.903544 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 9 02:43:56.903554 kernel: vgaarb: loaded
Oct 9 02:43:56.903560 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Oct 9 02:43:56.903567 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Oct 9 02:43:56.903573 kernel: clocksource: Switched to clocksource kvm-clock
Oct 9 02:43:56.903579 kernel: VFS: Disk quotas dquot_6.6.0
Oct 9 02:43:56.903586 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 9 02:43:56.903592 kernel: pnp: PnP ACPI init
Oct 9 02:43:56.903765 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Oct 9 02:43:56.903781 kernel: pnp: PnP ACPI: found 5 devices
Oct 9 02:43:56.903788 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 9 02:43:56.903794 kernel: NET: Registered PF_INET protocol family
Oct 9 02:43:56.903800 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Oct 9 02:43:56.903807 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Oct 9 02:43:56.903813 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 9 02:43:56.903820 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 9 02:43:56.903826 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Oct 9 02:43:56.903835 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Oct 9 02:43:56.903841 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Oct 9 02:43:56.903847 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Oct 9 02:43:56.903853 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 9 02:43:56.903860 kernel: NET: Registered PF_XDP protocol family
Oct 9 02:43:56.903964 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Oct 9 02:43:56.904098 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Oct 9 02:43:56.904229 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Oct 9 02:43:56.904341 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Oct 9 02:43:56.904446 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Oct 9 02:43:56.904550 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Oct 9 02:43:56.905771 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Oct 9 02:43:56.905908 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Oct 9 02:43:56.906020 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Oct 9 02:43:56.906123 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Oct 9 02:43:56.906227 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Oct 9 02:43:56.906335 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Oct 9 02:43:56.906437 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Oct 9 02:43:56.906540 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Oct 9 02:43:56.907494 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Oct 9 02:43:56.907702 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Oct 9 02:43:56.907816 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Oct 9 02:43:56.907921 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Oct 9 02:43:56.908028 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Oct 9 02:43:56.908149 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Oct 9 02:43:56.908252 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Oct 9 02:43:56.908354 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Oct 9 02:43:56.908455 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Oct 9 02:43:56.908557 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Oct 9 02:43:56.908692 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Oct 9 02:43:56.908799 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Oct 9 02:43:56.908901 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Oct 9 02:43:56.909005 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Oct 9 02:43:56.909113 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Oct 9 02:43:56.909215 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Oct 9 02:43:56.909317 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Oct 9 02:43:56.909419 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Oct 9 02:43:56.909524 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Oct 9 02:43:56.909670 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Oct 9 02:43:56.909776 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Oct 9 02:43:56.909884 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Oct 9 02:43:56.909981 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Oct 9 02:43:56.910076 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Oct 9 02:43:56.910173 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 9 02:43:56.910267 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Oct 9 02:43:56.910360 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Oct 9 02:43:56.910452 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Oct 9 02:43:56.910564 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Oct 9 02:43:56.914714 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Oct 9 02:43:56.914835 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Oct 9 02:43:56.914944 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Oct 9 02:43:56.915051 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Oct 9 02:43:56.915177 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Oct 9 02:43:56.915285 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Oct 9 02:43:56.915384 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Oct 9 02:43:56.915489 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Oct 9 02:43:56.915592 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Oct 9 02:43:56.915720 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Oct 9 02:43:56.915821 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Oct 9 02:43:56.915927 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Oct 9 02:43:56.916027 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Oct 9 02:43:56.916125 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Oct 9 02:43:56.916230 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Oct 9 02:43:56.916335 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Oct 9 02:43:56.916433 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Oct 9 02:43:56.916557 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Oct 9 02:43:56.917755 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Oct 9 02:43:56.917863 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Oct 9 02:43:56.917873 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Oct 9 02:43:56.917880 kernel: PCI: CLS 0 bytes, default 64
Oct 9 02:43:56.917891 kernel: Initialise system trusted keyrings
Oct 9 02:43:56.917898 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Oct 9 02:43:56.917905 kernel: Key type asymmetric registered
Oct 9 02:43:56.917911 kernel: Asymmetric key parser 'x509' registered
Oct 9 02:43:56.917917 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Oct 9 02:43:56.917925 kernel: io scheduler mq-deadline registered
Oct 9 02:43:56.917931 kernel: io scheduler kyber registered
Oct 9 02:43:56.917938 kernel: io scheduler bfq registered
Oct 9 02:43:56.918044 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Oct 9 02:43:56.918156 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Oct 9 02:43:56.919700 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Oct 9 02:43:56.919815 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Oct 9 02:43:56.919920 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Oct 9 02:43:56.920024 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Oct 9 02:43:56.920132 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Oct 9 02:43:56.920237 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Oct 9 02:43:56.920342 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Oct 9 02:43:56.920452 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Oct 9 02:43:56.920555 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Oct 9 02:43:56.920844 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Oct 9 02:43:56.921014 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Oct 9 02:43:56.921121 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Oct 9 02:43:56.921226 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Oct 9 02:43:56.921330 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Oct 9 02:43:56.921340 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Oct 9 02:43:56.921442 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Oct 9 02:43:56.921550 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Oct 9 02:43:56.921560 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Oct 9 02:43:56.921567 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Oct 9 02:43:56.921574 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 9 02:43:56.921580 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 9 02:43:56.921587 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 9 02:43:56.921594 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 9 02:43:56.921638 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 9 02:43:56.921648 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Oct 9 02:43:56.921767 kernel:
rtc_cmos 00:03: RTC can wake from S4 Oct 9 02:43:56.921868 kernel: rtc_cmos 00:03: registered as rtc0 Oct 9 02:43:56.921964 kernel: rtc_cmos 00:03: setting system clock to 2024-10-09T02:43:56 UTC (1728441836) Oct 9 02:43:56.922060 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Oct 9 02:43:56.922069 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Oct 9 02:43:56.922075 kernel: NET: Registered PF_INET6 protocol family Oct 9 02:43:56.922082 kernel: Segment Routing with IPv6 Oct 9 02:43:56.922092 kernel: In-situ OAM (IOAM) with IPv6 Oct 9 02:43:56.922099 kernel: NET: Registered PF_PACKET protocol family Oct 9 02:43:56.922105 kernel: Key type dns_resolver registered Oct 9 02:43:56.922112 kernel: IPI shorthand broadcast: enabled Oct 9 02:43:56.922119 kernel: sched_clock: Marking stable (1073009322, 131304010)->(1213630957, -9317625) Oct 9 02:43:56.922126 kernel: registered taskstats version 1 Oct 9 02:43:56.922132 kernel: Loading compiled-in X.509 certificates Oct 9 02:43:56.922139 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.54-flatcar: 03ae66f5ce294ce3ab718ee0d7c4a4a6e8c5aae6' Oct 9 02:43:56.922145 kernel: Key type .fscrypt registered Oct 9 02:43:56.922154 kernel: Key type fscrypt-provisioning registered Oct 9 02:43:56.922161 kernel: ima: No TPM chip found, activating TPM-bypass! 
Oct 9 02:43:56.922167 kernel: ima: Allocated hash algorithm: sha1 Oct 9 02:43:56.922174 kernel: ima: No architecture policies found Oct 9 02:43:56.922180 kernel: clk: Disabling unused clocks Oct 9 02:43:56.922187 kernel: Freeing unused kernel image (initmem) memory: 42872K Oct 9 02:43:56.922194 kernel: Write protecting the kernel read-only data: 36864k Oct 9 02:43:56.922200 kernel: Freeing unused kernel image (rodata/data gap) memory: 1848K Oct 9 02:43:56.922207 kernel: Run /init as init process Oct 9 02:43:56.922216 kernel: with arguments: Oct 9 02:43:56.922222 kernel: /init Oct 9 02:43:56.922229 kernel: with environment: Oct 9 02:43:56.922235 kernel: HOME=/ Oct 9 02:43:56.922241 kernel: TERM=linux Oct 9 02:43:56.922248 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Oct 9 02:43:56.922256 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Oct 9 02:43:56.922265 systemd[1]: Detected virtualization kvm. Oct 9 02:43:56.922274 systemd[1]: Detected architecture x86-64. Oct 9 02:43:56.922281 systemd[1]: Running in initrd. Oct 9 02:43:56.922287 systemd[1]: No hostname configured, using default hostname. Oct 9 02:43:56.922294 systemd[1]: Hostname set to . Oct 9 02:43:56.922301 systemd[1]: Initializing machine ID from VM UUID. Oct 9 02:43:56.922308 systemd[1]: Queued start job for default target initrd.target. Oct 9 02:43:56.922315 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 9 02:43:56.922322 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 9 02:43:56.922332 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Oct 9 02:43:56.922340 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 9 02:43:56.922352 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Oct 9 02:43:56.922365 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Oct 9 02:43:56.922375 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Oct 9 02:43:56.922382 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Oct 9 02:43:56.922389 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 9 02:43:56.922398 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 9 02:43:56.922405 systemd[1]: Reached target paths.target - Path Units. Oct 9 02:43:56.922412 systemd[1]: Reached target slices.target - Slice Units. Oct 9 02:43:56.922419 systemd[1]: Reached target swap.target - Swaps. Oct 9 02:43:56.922426 systemd[1]: Reached target timers.target - Timer Units. Oct 9 02:43:56.922433 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Oct 9 02:43:56.922440 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 9 02:43:56.922447 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Oct 9 02:43:56.922455 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Oct 9 02:43:56.922462 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 9 02:43:56.922469 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 9 02:43:56.922476 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 9 02:43:56.922483 systemd[1]: Reached target sockets.target - Socket Units. Oct 9 02:43:56.922494 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Oct 9 02:43:56.922507 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 9 02:43:56.922519 systemd[1]: Finished network-cleanup.service - Network Cleanup. Oct 9 02:43:56.922531 systemd[1]: Starting systemd-fsck-usr.service... Oct 9 02:43:56.922548 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 9 02:43:56.922557 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 9 02:43:56.922564 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 9 02:43:56.922590 systemd-journald[187]: Collecting audit messages is disabled. Oct 9 02:43:56.922648 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Oct 9 02:43:56.922655 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 9 02:43:56.922662 systemd[1]: Finished systemd-fsck-usr.service. Oct 9 02:43:56.922670 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 9 02:43:56.922680 systemd-journald[187]: Journal started Oct 9 02:43:56.922695 systemd-journald[187]: Runtime Journal (/run/log/journal/5c7e26b957e0431096ade91dd3782794) is 4.8M, max 38.4M, 33.6M free. Oct 9 02:43:56.889900 systemd-modules-load[188]: Inserted module 'overlay' Oct 9 02:43:56.958520 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Oct 9 02:43:56.958546 kernel: Bridge firewalling registered Oct 9 02:43:56.958556 systemd[1]: Started systemd-journald.service - Journal Service. Oct 9 02:43:56.927262 systemd-modules-load[188]: Inserted module 'br_netfilter' Oct 9 02:43:56.959930 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 9 02:43:56.960594 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Oct 9 02:43:56.961626 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 9 02:43:56.969721 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 9 02:43:56.971697 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 9 02:43:56.975371 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 9 02:43:56.977786 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 9 02:43:56.988527 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 9 02:43:56.996924 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 9 02:43:56.999943 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 9 02:43:57.005814 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Oct 9 02:43:57.007450 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 9 02:43:57.012770 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 9 02:43:57.016650 dracut-cmdline[220]: dracut-dracut-053 Oct 9 02:43:57.017270 dracut-cmdline[220]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ecc53326196a1bacd9ba781ce772ef34cdd5fe5561cf830307501ec3d5ba168a Oct 9 02:43:57.050363 systemd-resolved[225]: Positive Trust Anchors: Oct 9 02:43:57.050382 systemd-resolved[225]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 9 02:43:57.050408 systemd-resolved[225]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 9 02:43:57.057661 systemd-resolved[225]: Defaulting to hostname 'linux'. Oct 9 02:43:57.058880 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 9 02:43:57.059531 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 9 02:43:57.084636 kernel: SCSI subsystem initialized Oct 9 02:43:57.093652 kernel: Loading iSCSI transport class v2.0-870. Oct 9 02:43:57.103641 kernel: iscsi: registered transport (tcp) Oct 9 02:43:57.122709 kernel: iscsi: registered transport (qla4xxx) Oct 9 02:43:57.122768 kernel: QLogic iSCSI HBA Driver Oct 9 02:43:57.168587 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Oct 9 02:43:57.173788 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Oct 9 02:43:57.198629 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Oct 9 02:43:57.198682 kernel: device-mapper: uevent: version 1.0.3 Oct 9 02:43:57.198695 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Oct 9 02:43:57.239633 kernel: raid6: avx2x4 gen() 30612 MB/s Oct 9 02:43:57.256625 kernel: raid6: avx2x2 gen() 29600 MB/s Oct 9 02:43:57.273756 kernel: raid6: avx2x1 gen() 23707 MB/s Oct 9 02:43:57.273787 kernel: raid6: using algorithm avx2x4 gen() 30612 MB/s Oct 9 02:43:57.291880 kernel: raid6: .... xor() 4564 MB/s, rmw enabled Oct 9 02:43:57.291931 kernel: raid6: using avx2x2 recovery algorithm Oct 9 02:43:57.310632 kernel: xor: automatically using best checksumming function avx Oct 9 02:43:57.436647 kernel: Btrfs loaded, zoned=no, fsverity=no Oct 9 02:43:57.450392 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Oct 9 02:43:57.459824 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 9 02:43:57.471172 systemd-udevd[405]: Using default interface naming scheme 'v255'. Oct 9 02:43:57.474997 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 9 02:43:57.483765 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Oct 9 02:43:57.496439 dracut-pre-trigger[410]: rd.md=0: removing MD RAID activation Oct 9 02:43:57.526069 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Oct 9 02:43:57.531745 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 9 02:43:57.607303 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 9 02:43:57.614989 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Oct 9 02:43:57.632871 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Oct 9 02:43:57.636519 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Oct 9 02:43:57.637686 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 9 02:43:57.639504 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 9 02:43:57.646781 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Oct 9 02:43:57.661146 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Oct 9 02:43:57.742667 kernel: scsi host0: Virtio SCSI HBA Oct 9 02:43:57.742751 kernel: cryptd: max_cpu_qlen set to 1000 Oct 9 02:43:57.755845 kernel: AVX2 version of gcm_enc/dec engaged. Oct 9 02:43:57.755903 kernel: AES CTR mode by8 optimization enabled Oct 9 02:43:57.755913 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Oct 9 02:43:57.772268 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Oct 9 02:43:57.775297 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 9 02:43:57.777443 kernel: ACPI: bus type USB registered Oct 9 02:43:57.779020 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 9 02:43:57.780580 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 9 02:43:57.786280 kernel: usbcore: registered new interface driver usbfs Oct 9 02:43:57.786307 kernel: usbcore: registered new interface driver hub Oct 9 02:43:57.780884 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 9 02:43:57.789663 kernel: usbcore: registered new device driver usb Oct 9 02:43:57.782494 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Oct 9 02:43:57.792999 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 9 02:43:57.824632 kernel: libata version 3.00 loaded. 
Oct 9 02:43:57.843893 kernel: ahci 0000:00:1f.2: version 3.0 Oct 9 02:43:57.844116 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Oct 9 02:43:57.844130 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Oct 9 02:43:57.844260 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Oct 9 02:43:57.846709 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Oct 9 02:43:57.846886 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Oct 9 02:43:57.847649 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Oct 9 02:43:57.849617 kernel: scsi host1: ahci Oct 9 02:43:57.849780 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Oct 9 02:43:57.849913 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Oct 9 02:43:57.850040 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Oct 9 02:43:57.850629 kernel: hub 1-0:1.0: USB hub found Oct 9 02:43:57.850807 kernel: hub 1-0:1.0: 4 ports detected Oct 9 02:43:57.851638 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Oct 9 02:43:57.851810 kernel: hub 2-0:1.0: USB hub found Oct 9 02:43:57.851951 kernel: hub 2-0:1.0: 4 ports detected Oct 9 02:43:57.852628 kernel: scsi host2: ahci Oct 9 02:43:57.854617 kernel: scsi host3: ahci Oct 9 02:43:57.854792 kernel: scsi host4: ahci Oct 9 02:43:57.856637 kernel: scsi host5: ahci Oct 9 02:43:57.866978 kernel: scsi host6: ahci Oct 9 02:43:57.867185 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 Oct 9 02:43:57.867196 kernel: sd 0:0:0:0: Power-on or device reset occurred Oct 9 02:43:57.867357 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 Oct 9 02:43:57.867366 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Oct 9 02:43:57.867501 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46 Oct 9 02:43:57.867510 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 Oct 9 02:43:57.867525 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 Oct 9 02:43:57.867533 kernel: sd 0:0:0:0: [sda] Write Protect is off Oct 9 02:43:57.867686 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 Oct 9 02:43:57.888624 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Oct 9 02:43:57.889579 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 9 02:43:57.900025 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Oct 9 02:43:57.900201 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Oct 9 02:43:57.900220 kernel: GPT:17805311 != 80003071 Oct 9 02:43:57.900229 kernel: GPT:Alternate GPT header not at the end of the disk. Oct 9 02:43:57.900237 kernel: GPT:17805311 != 80003071 Oct 9 02:43:57.900245 kernel: GPT: Use GNU Parted to correct GPT errors. 
Oct 9 02:43:57.900254 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 9 02:43:57.900262 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Oct 9 02:43:57.904798 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 9 02:43:57.918045 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 9 02:43:58.087633 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Oct 9 02:43:58.185628 kernel: ata3: SATA link down (SStatus 0 SControl 300) Oct 9 02:43:58.185701 kernel: ata2: SATA link down (SStatus 0 SControl 300) Oct 9 02:43:58.185714 kernel: ata5: SATA link down (SStatus 0 SControl 300) Oct 9 02:43:58.185737 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Oct 9 02:43:58.185748 kernel: ata6: SATA link down (SStatus 0 SControl 300) Oct 9 02:43:58.185758 kernel: ata4: SATA link down (SStatus 0 SControl 300) Oct 9 02:43:58.187643 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Oct 9 02:43:58.193496 kernel: ata1.00: applying bridge limits Oct 9 02:43:58.195042 kernel: ata1.00: configured for UDMA/100 Oct 9 02:43:58.196782 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Oct 9 02:43:58.229627 kernel: hid: raw HID events driver (C) Jiri Kosina Oct 9 02:43:58.237634 kernel: usbcore: registered new interface driver usbhid Oct 9 02:43:58.237696 kernel: usbhid: USB HID core driver Oct 9 02:43:58.246516 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Oct 9 02:43:58.246555 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Oct 9 02:43:58.248426 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Oct 9 02:43:58.248661 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Oct 9 02:43:58.265640 kernel: BTRFS: device fsid 6ed52ce5-b2f8-4d16-8889-677a209bc377 devid 1 
transid 36 /dev/sda3 scanned by (udev-worker) (451) Oct 9 02:43:58.267643 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 scanned by (udev-worker) (456) Oct 9 02:43:58.267680 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Oct 9 02:43:58.278534 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Oct 9 02:43:58.285847 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Oct 9 02:43:58.290346 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Oct 9 02:43:58.295235 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Oct 9 02:43:58.295823 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Oct 9 02:43:58.301816 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Oct 9 02:43:58.308397 disk-uuid[575]: Primary Header is updated. Oct 9 02:43:58.308397 disk-uuid[575]: Secondary Entries is updated. Oct 9 02:43:58.308397 disk-uuid[575]: Secondary Header is updated. Oct 9 02:43:58.313640 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 9 02:43:58.319635 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 9 02:43:59.327639 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 9 02:43:59.328329 disk-uuid[576]: The operation has completed successfully. Oct 9 02:43:59.385585 systemd[1]: disk-uuid.service: Deactivated successfully. Oct 9 02:43:59.385730 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Oct 9 02:43:59.395738 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Oct 9 02:43:59.398673 sh[593]: Success Oct 9 02:43:59.411463 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" Oct 9 02:43:59.450858 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Oct 9 02:43:59.453707 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Oct 9 02:43:59.454351 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Oct 9 02:43:59.471988 kernel: BTRFS info (device dm-0): first mount of filesystem 6ed52ce5-b2f8-4d16-8889-677a209bc377 Oct 9 02:43:59.472022 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Oct 9 02:43:59.474718 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Oct 9 02:43:59.474735 kernel: BTRFS info (device dm-0): disabling log replay at mount time Oct 9 02:43:59.476992 kernel: BTRFS info (device dm-0): using free space tree Oct 9 02:43:59.484628 kernel: BTRFS info (device dm-0): enabling ssd optimizations Oct 9 02:43:59.486151 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Oct 9 02:43:59.487249 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Oct 9 02:43:59.492738 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Oct 9 02:43:59.496152 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Oct 9 02:43:59.508617 kernel: BTRFS info (device sda6): first mount of filesystem 7abc21fd-6b75-4be0-8205-dc564a91a608 Oct 9 02:43:59.511635 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 9 02:43:59.511657 kernel: BTRFS info (device sda6): using free space tree Oct 9 02:43:59.516912 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 9 02:43:59.516933 kernel: BTRFS info (device sda6): auto enabling async discard Oct 9 02:43:59.527083 kernel: BTRFS info (device sda6): last unmount of filesystem 7abc21fd-6b75-4be0-8205-dc564a91a608 Oct 9 02:43:59.526827 systemd[1]: mnt-oem.mount: Deactivated successfully. Oct 9 02:43:59.532121 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Oct 9 02:43:59.538796 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Oct 9 02:43:59.607448 ignition[701]: Ignition 2.19.0 Oct 9 02:43:59.608176 ignition[701]: Stage: fetch-offline Oct 9 02:43:59.608212 ignition[701]: no configs at "/usr/lib/ignition/base.d" Oct 9 02:43:59.608222 ignition[701]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Oct 9 02:43:59.608342 ignition[701]: parsed url from cmdline: "" Oct 9 02:43:59.608346 ignition[701]: no config URL provided Oct 9 02:43:59.608350 ignition[701]: reading system config file "/usr/lib/ignition/user.ign" Oct 9 02:43:59.608359 ignition[701]: no config at "/usr/lib/ignition/user.ign" Oct 9 02:43:59.612382 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Oct 9 02:43:59.608364 ignition[701]: failed to fetch config: resource requires networking Oct 9 02:43:59.608550 ignition[701]: Ignition finished successfully Oct 9 02:43:59.615530 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 9 02:43:59.620759 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 9 02:43:59.647720 systemd-networkd[780]: lo: Link UP Oct 9 02:43:59.647728 systemd-networkd[780]: lo: Gained carrier Oct 9 02:43:59.650075 systemd-networkd[780]: Enumeration completed Oct 9 02:43:59.650275 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 9 02:43:59.650866 systemd[1]: Reached target network.target - Network. Oct 9 02:43:59.651742 systemd-networkd[780]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:43:59.651746 systemd-networkd[780]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 9 02:43:59.654509 systemd-networkd[780]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Oct 9 02:43:59.654513 systemd-networkd[780]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 9 02:43:59.656591 systemd-networkd[780]: eth0: Link UP Oct 9 02:43:59.656595 systemd-networkd[780]: eth0: Gained carrier Oct 9 02:43:59.656667 systemd-networkd[780]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:43:59.657084 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Oct 9 02:43:59.660293 systemd-networkd[780]: eth1: Link UP Oct 9 02:43:59.660297 systemd-networkd[780]: eth1: Gained carrier Oct 9 02:43:59.660304 systemd-networkd[780]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:43:59.668325 ignition[782]: Ignition 2.19.0 Oct 9 02:43:59.668980 ignition[782]: Stage: fetch Oct 9 02:43:59.669136 ignition[782]: no configs at "/usr/lib/ignition/base.d" Oct 9 02:43:59.669147 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Oct 9 02:43:59.669222 ignition[782]: parsed url from cmdline: "" Oct 9 02:43:59.669226 ignition[782]: no config URL provided Oct 9 02:43:59.669231 ignition[782]: reading system config file "/usr/lib/ignition/user.ign" Oct 9 02:43:59.669239 ignition[782]: no config at "/usr/lib/ignition/user.ign" Oct 9 02:43:59.669259 ignition[782]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Oct 9 02:43:59.669398 ignition[782]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Oct 9 02:43:59.694647 systemd-networkd[780]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Oct 9 02:43:59.804667 systemd-networkd[780]: eth0: DHCPv4 address 188.245.48.63/32, gateway 172.31.1.1 acquired from 172.31.1.1 Oct 9 02:43:59.870500 ignition[782]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Oct 9 02:43:59.874172 
ignition[782]: GET result: OK Oct 9 02:43:59.874281 ignition[782]: parsing config with SHA512: 371a92d8c3699b1720f068d31404cf51bcf27ca34e1d00124eac2824c81ae172ab3d0cdda607b52cc89884c4686f6d566f9983075a3e0bcef9690bc3b9d873ce Oct 9 02:43:59.878152 unknown[782]: fetched base config from "system" Oct 9 02:43:59.878173 unknown[782]: fetched base config from "system" Oct 9 02:43:59.878542 ignition[782]: fetch: fetch complete Oct 9 02:43:59.878184 unknown[782]: fetched user config from "hetzner" Oct 9 02:43:59.878548 ignition[782]: fetch: fetch passed Oct 9 02:43:59.881813 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Oct 9 02:43:59.878592 ignition[782]: Ignition finished successfully Oct 9 02:43:59.887798 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Oct 9 02:43:59.902251 ignition[789]: Ignition 2.19.0 Oct 9 02:43:59.902264 ignition[789]: Stage: kargs Oct 9 02:43:59.902431 ignition[789]: no configs at "/usr/lib/ignition/base.d" Oct 9 02:43:59.902444 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Oct 9 02:43:59.906137 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Oct 9 02:43:59.903263 ignition[789]: kargs: kargs passed Oct 9 02:43:59.903310 ignition[789]: Ignition finished successfully Oct 9 02:43:59.921836 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Oct 9 02:43:59.934937 ignition[796]: Ignition 2.19.0 Oct 9 02:43:59.934951 ignition[796]: Stage: disks Oct 9 02:43:59.935158 ignition[796]: no configs at "/usr/lib/ignition/base.d" Oct 9 02:43:59.935170 ignition[796]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Oct 9 02:43:59.935875 ignition[796]: disks: disks passed Oct 9 02:43:59.937157 systemd[1]: Finished ignition-disks.service - Ignition (disks). Oct 9 02:43:59.935920 ignition[796]: Ignition finished successfully Oct 9 02:43:59.938346 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
Oct 9 02:43:59.939182 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Oct 9 02:43:59.940113 systemd[1]: Reached target local-fs.target - Local File Systems.
Oct 9 02:43:59.941126 systemd[1]: Reached target sysinit.target - System Initialization.
Oct 9 02:43:59.942147 systemd[1]: Reached target basic.target - Basic System.
Oct 9 02:43:59.948858 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Oct 9 02:43:59.964579 systemd-fsck[805]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Oct 9 02:43:59.968153 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Oct 9 02:43:59.972724 systemd[1]: Mounting sysroot.mount - /sysroot...
Oct 9 02:44:00.062638 kernel: EXT4-fs (sda9): mounted filesystem ba2945c1-be14-41c0-8c54-84d676c7a16b r/w with ordered data mode. Quota mode: none.
Oct 9 02:44:00.063470 systemd[1]: Mounted sysroot.mount - /sysroot.
Oct 9 02:44:00.064452 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Oct 9 02:44:00.081802 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Oct 9 02:44:00.084710 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Oct 9 02:44:00.087802 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Oct 9 02:44:00.092678 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Oct 9 02:44:00.095590 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sda6 scanned by mount (813)
Oct 9 02:44:00.092761 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Oct 9 02:44:00.098617 kernel: BTRFS info (device sda6): first mount of filesystem 7abc21fd-6b75-4be0-8205-dc564a91a608
Oct 9 02:44:00.098640 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Oct 9 02:44:00.098650 kernel: BTRFS info (device sda6): using free space tree
Oct 9 02:44:00.106375 kernel: BTRFS info (device sda6): enabling ssd optimizations
Oct 9 02:44:00.106424 kernel: BTRFS info (device sda6): auto enabling async discard
Oct 9 02:44:00.105973 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Oct 9 02:44:00.113820 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Oct 9 02:44:00.116694 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Oct 9 02:44:00.152903 coreos-metadata[815]: Oct 09 02:44:00.152 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Oct 9 02:44:00.154653 coreos-metadata[815]: Oct 09 02:44:00.154 INFO Fetch successful
Oct 9 02:44:00.154653 coreos-metadata[815]: Oct 09 02:44:00.154 INFO wrote hostname ci-4116-0-0-c-ec98df32e3 to /sysroot/etc/hostname
Oct 9 02:44:00.157361 initrd-setup-root[840]: cut: /sysroot/etc/passwd: No such file or directory
Oct 9 02:44:00.157680 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Oct 9 02:44:00.163632 initrd-setup-root[848]: cut: /sysroot/etc/group: No such file or directory
Oct 9 02:44:00.168920 initrd-setup-root[855]: cut: /sysroot/etc/shadow: No such file or directory
Oct 9 02:44:00.173411 initrd-setup-root[862]: cut: /sysroot/etc/gshadow: No such file or directory
Oct 9 02:44:00.265371 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Oct 9 02:44:00.270706 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Oct 9 02:44:00.272784 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Oct 9 02:44:00.282648 kernel: BTRFS info (device sda6): last unmount of filesystem 7abc21fd-6b75-4be0-8205-dc564a91a608
Oct 9 02:44:00.307327 ignition[934]: INFO : Ignition 2.19.0
Oct 9 02:44:00.307327 ignition[934]: INFO : Stage: mount
Oct 9 02:44:00.307327 ignition[934]: INFO : no configs at "/usr/lib/ignition/base.d"
Oct 9 02:44:00.307327 ignition[934]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Oct 9 02:44:00.310560 ignition[934]: INFO : mount: mount passed
Oct 9 02:44:00.310560 ignition[934]: INFO : Ignition finished successfully
Oct 9 02:44:00.310108 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Oct 9 02:44:00.312113 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Oct 9 02:44:00.316731 systemd[1]: Starting ignition-files.service - Ignition (files)...
Oct 9 02:44:00.471060 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Oct 9 02:44:00.475780 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Oct 9 02:44:00.489641 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by mount (947)
Oct 9 02:44:00.489707 kernel: BTRFS info (device sda6): first mount of filesystem 7abc21fd-6b75-4be0-8205-dc564a91a608
Oct 9 02:44:00.491831 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Oct 9 02:44:00.494313 kernel: BTRFS info (device sda6): using free space tree
Oct 9 02:44:00.499476 kernel: BTRFS info (device sda6): enabling ssd optimizations
Oct 9 02:44:00.499524 kernel: BTRFS info (device sda6): auto enabling async discard
Oct 9 02:44:00.502642 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Oct 9 02:44:00.523823 ignition[963]: INFO : Ignition 2.19.0
Oct 9 02:44:00.523823 ignition[963]: INFO : Stage: files
Oct 9 02:44:00.525002 ignition[963]: INFO : no configs at "/usr/lib/ignition/base.d"
Oct 9 02:44:00.525002 ignition[963]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Oct 9 02:44:00.526392 ignition[963]: DEBUG : files: compiled without relabeling support, skipping
Oct 9 02:44:00.526997 ignition[963]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Oct 9 02:44:00.526997 ignition[963]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Oct 9 02:44:00.530328 ignition[963]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Oct 9 02:44:00.531005 ignition[963]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Oct 9 02:44:00.531968 ignition[963]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Oct 9 02:44:00.531106 unknown[963]: wrote ssh authorized keys file for user: core
Oct 9 02:44:00.533254 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Oct 9 02:44:00.533254 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Oct 9 02:44:00.737126 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Oct 9 02:44:00.947772 systemd-networkd[780]: eth0: Gained IPv6LL
Oct 9 02:44:01.011818 systemd-networkd[780]: eth1: Gained IPv6LL
Oct 9 02:44:02.378914 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Oct 9 02:44:02.381026 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Oct 9 02:44:02.918673 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Oct 9 02:44:03.176665 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Oct 9 02:44:03.176665 ignition[963]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Oct 9 02:44:03.180466 ignition[963]: INFO : files: files passed
Oct 9 02:44:03.180466 ignition[963]: INFO : Ignition finished successfully
Oct 9 02:44:03.181363 systemd[1]: Finished ignition-files.service - Ignition (files).
Oct 9 02:44:03.190311 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Oct 9 02:44:03.194777 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Oct 9 02:44:03.196340 systemd[1]: ignition-quench.service: Deactivated successfully.
Oct 9 02:44:03.196442 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Oct 9 02:44:03.207787 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Oct 9 02:44:03.208795 initrd-setup-root-after-ignition[993]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Oct 9 02:44:03.209927 initrd-setup-root-after-ignition[997]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Oct 9 02:44:03.212437 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Oct 9 02:44:03.213837 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Oct 9 02:44:03.218777 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Oct 9 02:44:03.240401 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 9 02:44:03.240526 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Oct 9 02:44:03.241716 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Oct 9 02:44:03.242585 systemd[1]: Reached target initrd.target - Initrd Default Target.
Oct 9 02:44:03.243689 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Oct 9 02:44:03.249827 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Oct 9 02:44:03.262014 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Oct 9 02:44:03.266743 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Oct 9 02:44:03.276284 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Oct 9 02:44:03.276893 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Oct 9 02:44:03.277945 systemd[1]: Stopped target timers.target - Timer Units.
Oct 9 02:44:03.278945 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 9 02:44:03.279066 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Oct 9 02:44:03.280391 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Oct 9 02:44:03.281017 systemd[1]: Stopped target basic.target - Basic System.
Oct 9 02:44:03.282039 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Oct 9 02:44:03.283015 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Oct 9 02:44:03.283929 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Oct 9 02:44:03.284971 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Oct 9 02:44:03.286017 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Oct 9 02:44:03.287119 systemd[1]: Stopped target sysinit.target - System Initialization.
Oct 9 02:44:03.288123 systemd[1]: Stopped target local-fs.target - Local File Systems.
Oct 9 02:44:03.289168 systemd[1]: Stopped target swap.target - Swaps.
Oct 9 02:44:03.290109 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 9 02:44:03.290207 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Oct 9 02:44:03.291331 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Oct 9 02:44:03.292012 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Oct 9 02:44:03.292899 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Oct 9 02:44:03.294648 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Oct 9 02:44:03.295746 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 9 02:44:03.295839 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Oct 9 02:44:03.297189 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Oct 9 02:44:03.297292 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Oct 9 02:44:03.297952 systemd[1]: ignition-files.service: Deactivated successfully.
Oct 9 02:44:03.298088 systemd[1]: Stopped ignition-files.service - Ignition (files).
Oct 9 02:44:03.298969 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Oct 9 02:44:03.299116 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Oct 9 02:44:03.303821 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Oct 9 02:44:03.305079 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Oct 9 02:44:03.306050 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 9 02:44:03.306568 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Oct 9 02:44:03.310001 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 9 02:44:03.310105 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Oct 9 02:44:03.315975 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 9 02:44:03.316579 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Oct 9 02:44:03.320265 ignition[1017]: INFO : Ignition 2.19.0
Oct 9 02:44:03.320265 ignition[1017]: INFO : Stage: umount
Oct 9 02:44:03.326223 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d"
Oct 9 02:44:03.326223 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Oct 9 02:44:03.326223 ignition[1017]: INFO : umount: umount passed
Oct 9 02:44:03.326223 ignition[1017]: INFO : Ignition finished successfully
Oct 9 02:44:03.322972 systemd[1]: ignition-mount.service: Deactivated successfully.
Oct 9 02:44:03.323122 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Oct 9 02:44:03.324162 systemd[1]: ignition-disks.service: Deactivated successfully.
Oct 9 02:44:03.324237 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Oct 9 02:44:03.326705 systemd[1]: ignition-kargs.service: Deactivated successfully.
Oct 9 02:44:03.326753 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Oct 9 02:44:03.331726 systemd[1]: ignition-fetch.service: Deactivated successfully.
Oct 9 02:44:03.331777 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Oct 9 02:44:03.332249 systemd[1]: Stopped target network.target - Network.
Oct 9 02:44:03.334192 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Oct 9 02:44:03.334244 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Oct 9 02:44:03.336723 systemd[1]: Stopped target paths.target - Path Units.
Oct 9 02:44:03.337383 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 9 02:44:03.337462 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Oct 9 02:44:03.337946 systemd[1]: Stopped target slices.target - Slice Units.
Oct 9 02:44:03.338347 systemd[1]: Stopped target sockets.target - Socket Units.
Oct 9 02:44:03.338864 systemd[1]: iscsid.socket: Deactivated successfully.
Oct 9 02:44:03.338906 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Oct 9 02:44:03.339364 systemd[1]: iscsiuio.socket: Deactivated successfully.
Oct 9 02:44:03.339403 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Oct 9 02:44:03.341705 systemd[1]: ignition-setup.service: Deactivated successfully.
Oct 9 02:44:03.341752 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Oct 9 02:44:03.342216 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Oct 9 02:44:03.342259 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Oct 9 02:44:03.345748 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Oct 9 02:44:03.346487 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Oct 9 02:44:03.348430 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Oct 9 02:44:03.349070 systemd[1]: sysroot-boot.service: Deactivated successfully.
Oct 9 02:44:03.349172 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Oct 9 02:44:03.350511 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Oct 9 02:44:03.350593 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Oct 9 02:44:03.352559 systemd[1]: systemd-resolved.service: Deactivated successfully.
Oct 9 02:44:03.352703 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Oct 9 02:44:03.353651 systemd-networkd[780]: eth0: DHCPv6 lease lost
Oct 9 02:44:03.355576 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 9 02:44:03.355646 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Oct 9 02:44:03.357715 systemd-networkd[780]: eth1: DHCPv6 lease lost
Oct 9 02:44:03.359950 systemd[1]: systemd-networkd.service: Deactivated successfully.
Oct 9 02:44:03.360080 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Oct 9 02:44:03.361202 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Oct 9 02:44:03.361239 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Oct 9 02:44:03.367708 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Oct 9 02:44:03.368384 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Oct 9 02:44:03.368438 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Oct 9 02:44:03.368938 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 9 02:44:03.368983 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Oct 9 02:44:03.369420 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 9 02:44:03.369462 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Oct 9 02:44:03.370923 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Oct 9 02:44:03.386467 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 9 02:44:03.386663 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Oct 9 02:44:03.387638 systemd[1]: network-cleanup.service: Deactivated successfully.
Oct 9 02:44:03.387755 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Oct 9 02:44:03.388805 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 9 02:44:03.388866 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Oct 9 02:44:03.389470 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 9 02:44:03.389510 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Oct 9 02:44:03.390354 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 9 02:44:03.390401 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Oct 9 02:44:03.391768 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 9 02:44:03.391813 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Oct 9 02:44:03.392937 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Oct 9 02:44:03.392986 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Oct 9 02:44:03.400802 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Oct 9 02:44:03.402012 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 9 02:44:03.402106 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Oct 9 02:44:03.402783 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Oct 9 02:44:03.402841 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Oct 9 02:44:03.403511 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 9 02:44:03.403568 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Oct 9 02:44:03.406705 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 9 02:44:03.406771 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Oct 9 02:44:03.409078 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 9 02:44:03.409191 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Oct 9 02:44:03.410939 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Oct 9 02:44:03.417001 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Oct 9 02:44:03.423568 systemd[1]: Switching root.
Oct 9 02:44:03.449811 systemd-journald[187]: Journal stopped
Oct 9 02:44:04.496954 systemd-journald[187]: Received SIGTERM from PID 1 (systemd).
Oct 9 02:44:04.497030 kernel: SELinux: policy capability network_peer_controls=1
Oct 9 02:44:04.497048 kernel: SELinux: policy capability open_perms=1
Oct 9 02:44:04.497058 kernel: SELinux: policy capability extended_socket_class=1
Oct 9 02:44:04.497070 kernel: SELinux: policy capability always_check_network=0
Oct 9 02:44:04.497079 kernel: SELinux: policy capability cgroup_seclabel=1
Oct 9 02:44:04.497096 kernel: SELinux: policy capability nnp_nosuid_transition=1
Oct 9 02:44:04.497111 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Oct 9 02:44:04.497120 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Oct 9 02:44:04.497129 kernel: audit: type=1403 audit(1728441843.624:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 9 02:44:04.497144 systemd[1]: Successfully loaded SELinux policy in 43.018ms.
Oct 9 02:44:04.497161 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 14.201ms.
Oct 9 02:44:04.497174 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Oct 9 02:44:04.497184 systemd[1]: Detected virtualization kvm.
Oct 9 02:44:04.497195 systemd[1]: Detected architecture x86-64.
Oct 9 02:44:04.497204 systemd[1]: Detected first boot.
Oct 9 02:44:04.497215 systemd[1]: Hostname set to .
Oct 9 02:44:04.497225 systemd[1]: Initializing machine ID from VM UUID.
Oct 9 02:44:04.497235 zram_generator::config[1064]: No configuration found.
Oct 9 02:44:04.497246 systemd[1]: Populated /etc with preset unit settings.
Oct 9 02:44:04.497262 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 9 02:44:04.497273 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Oct 9 02:44:04.497283 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 9 02:44:04.497293 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Oct 9 02:44:04.497309 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Oct 9 02:44:04.497319 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Oct 9 02:44:04.497328 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Oct 9 02:44:04.497338 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Oct 9 02:44:04.497349 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Oct 9 02:44:04.497362 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Oct 9 02:44:04.497372 systemd[1]: Created slice user.slice - User and Session Slice.
Oct 9 02:44:04.497382 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Oct 9 02:44:04.497392 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Oct 9 02:44:04.497402 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Oct 9 02:44:04.497412 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Oct 9 02:44:04.497422 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Oct 9 02:44:04.497432 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Oct 9 02:44:04.497445 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Oct 9 02:44:04.497455 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Oct 9 02:44:04.497465 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Oct 9 02:44:04.497475 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Oct 9 02:44:04.497485 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Oct 9 02:44:04.497495 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Oct 9 02:44:04.497507 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Oct 9 02:44:04.497517 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Oct 9 02:44:04.497527 systemd[1]: Reached target slices.target - Slice Units.
Oct 9 02:44:04.497537 systemd[1]: Reached target swap.target - Swaps.
Oct 9 02:44:04.497547 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Oct 9 02:44:04.497557 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Oct 9 02:44:04.497567 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Oct 9 02:44:04.497577 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Oct 9 02:44:04.497587 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Oct 9 02:44:04.499235 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Oct 9 02:44:04.499269 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Oct 9 02:44:04.499281 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Oct 9 02:44:04.499292 systemd[1]: Mounting media.mount - External Media Directory...
Oct 9 02:44:04.499314 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Oct 9 02:44:04.499326 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Oct 9 02:44:04.499339 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Oct 9 02:44:04.499357 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Oct 9 02:44:04.499376 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 9 02:44:04.499393 systemd[1]: Reached target machines.target - Containers.
Oct 9 02:44:04.499410 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Oct 9 02:44:04.499428 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Oct 9 02:44:04.499445 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Oct 9 02:44:04.499455 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Oct 9 02:44:04.499466 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Oct 9 02:44:04.499480 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Oct 9 02:44:04.499491 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Oct 9 02:44:04.499501 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Oct 9 02:44:04.499512 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Oct 9 02:44:04.499525 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Oct 9 02:44:04.499535 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 9 02:44:04.499546 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Oct 9 02:44:04.499556 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Oct 9 02:44:04.499568 kernel: fuse: init (API version 7.39)
Oct 9 02:44:04.499579 systemd[1]: Stopped systemd-fsck-usr.service.
Oct 9 02:44:04.499590 systemd[1]: Starting systemd-journald.service - Journal Service...
Oct 9 02:44:04.500669 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Oct 9 02:44:04.500690 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 9 02:44:04.500702 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Oct 9 02:44:04.500713 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 9 02:44:04.500723 systemd[1]: verity-setup.service: Deactivated successfully. Oct 9 02:44:04.500733 systemd[1]: Stopped verity-setup.service. Oct 9 02:44:04.500748 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:04.500758 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Oct 9 02:44:04.500769 kernel: loop: module loaded Oct 9 02:44:04.500781 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Oct 9 02:44:04.500791 systemd[1]: Mounted media.mount - External Media Directory. Oct 9 02:44:04.500801 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Oct 9 02:44:04.500814 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Oct 9 02:44:04.500826 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Oct 9 02:44:04.500836 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Oct 9 02:44:04.500846 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 9 02:44:04.500857 kernel: ACPI: bus type drm_connector registered Oct 9 02:44:04.500867 systemd[1]: modprobe@configfs.service: Deactivated successfully. Oct 9 02:44:04.500877 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Oct 9 02:44:04.500887 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 9 02:44:04.500900 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 9 02:44:04.500911 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Oct 9 02:44:04.500921 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 9 02:44:04.500931 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 9 02:44:04.500941 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 9 02:44:04.500953 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 9 02:44:04.500964 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 9 02:44:04.500974 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 9 02:44:04.500984 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 9 02:44:04.500994 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 9 02:44:04.501005 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 9 02:44:04.501017 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Oct 9 02:44:04.501028 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 9 02:44:04.501060 systemd-journald[1136]: Collecting audit messages is disabled. Oct 9 02:44:04.501091 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Oct 9 02:44:04.501107 systemd-journald[1136]: Journal started Oct 9 02:44:04.501127 systemd-journald[1136]: Runtime Journal (/run/log/journal/5c7e26b957e0431096ade91dd3782794) is 4.8M, max 38.4M, 33.6M free. Oct 9 02:44:04.163858 systemd[1]: Queued start job for default target multi-user.target. Oct 9 02:44:04.186201 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Oct 9 02:44:04.186773 systemd[1]: systemd-journald.service: Deactivated successfully. Oct 9 02:44:04.508923 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Oct 9 02:44:04.512464 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). 
Oct 9 02:44:04.512504 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 9 02:44:04.516645 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Oct 9 02:44:04.526690 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Oct 9 02:44:04.538804 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Oct 9 02:44:04.538886 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 9 02:44:04.550190 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Oct 9 02:44:04.554643 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 9 02:44:04.563629 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Oct 9 02:44:04.568650 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 9 02:44:04.580641 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 9 02:44:04.588622 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Oct 9 02:44:04.600637 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 9 02:44:04.611869 systemd[1]: Started systemd-journald.service - Journal Service. Oct 9 02:44:04.616468 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 9 02:44:04.617107 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Oct 9 02:44:04.618595 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Oct 9 02:44:04.620822 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Oct 9 02:44:04.621839 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Oct 9 02:44:04.638625 kernel: loop0: detected capacity change from 0 to 8 Oct 9 02:44:04.649391 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Oct 9 02:44:04.659374 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 9 02:44:04.666078 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Oct 9 02:44:04.675774 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Oct 9 02:44:04.681623 kernel: loop1: detected capacity change from 0 to 211296 Oct 9 02:44:04.684757 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Oct 9 02:44:04.690379 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Oct 9 02:44:04.708946 systemd-tmpfiles[1163]: ACLs are not supported, ignoring. Oct 9 02:44:04.709194 systemd-tmpfiles[1163]: ACLs are not supported, ignoring. Oct 9 02:44:04.710548 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Oct 9 02:44:04.713866 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Oct 9 02:44:04.717990 systemd-journald[1136]: Time spent on flushing to /var/log/journal/5c7e26b957e0431096ade91dd3782794 is 30.894ms for 1148 entries. Oct 9 02:44:04.717990 systemd-journald[1136]: System Journal (/var/log/journal/5c7e26b957e0431096ade91dd3782794) is 8.0M, max 584.8M, 576.8M free. Oct 9 02:44:04.768459 systemd-journald[1136]: Received client request to flush runtime journal. Oct 9 02:44:04.768499 kernel: loop2: detected capacity change from 0 to 138192 Oct 9 02:44:04.725021 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 9 02:44:04.734768 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
Oct 9 02:44:04.754442 udevadm[1194]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Oct 9 02:44:04.770171 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Oct 9 02:44:04.797643 kernel: loop3: detected capacity change from 0 to 140992 Oct 9 02:44:04.806573 systemd[1]: Finished systemd-sysusers.service - Create System Users. Oct 9 02:44:04.813515 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 9 02:44:04.835159 systemd-tmpfiles[1204]: ACLs are not supported, ignoring. Oct 9 02:44:04.835456 systemd-tmpfiles[1204]: ACLs are not supported, ignoring. Oct 9 02:44:04.847251 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 9 02:44:04.852263 kernel: loop4: detected capacity change from 0 to 8 Oct 9 02:44:04.855742 kernel: loop5: detected capacity change from 0 to 211296 Oct 9 02:44:04.881661 kernel: loop6: detected capacity change from 0 to 138192 Oct 9 02:44:04.901733 kernel: loop7: detected capacity change from 0 to 140992 Oct 9 02:44:04.920622 (sd-merge)[1208]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Oct 9 02:44:04.921257 (sd-merge)[1208]: Merged extensions into '/usr'. Oct 9 02:44:04.925961 systemd[1]: Reloading requested from client PID 1162 ('systemd-sysext') (unit systemd-sysext.service)... Oct 9 02:44:04.926080 systemd[1]: Reloading... Oct 9 02:44:04.998629 zram_generator::config[1233]: No configuration found. Oct 9 02:44:05.136479 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Oct 9 02:44:05.177346 systemd[1]: Reloading finished in 250 ms. 
Oct 9 02:44:05.209453 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Oct 9 02:44:05.211678 ldconfig[1158]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Oct 9 02:44:05.215807 systemd[1]: Starting ensure-sysext.service... Oct 9 02:44:05.217878 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 9 02:44:05.219940 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Oct 9 02:44:05.231873 systemd[1]: Reloading requested from client PID 1276 ('systemctl') (unit ensure-sysext.service)... Oct 9 02:44:05.231889 systemd[1]: Reloading... Oct 9 02:44:05.243374 systemd-tmpfiles[1277]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Oct 9 02:44:05.244570 systemd-tmpfiles[1277]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Oct 9 02:44:05.245564 systemd-tmpfiles[1277]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Oct 9 02:44:05.245898 systemd-tmpfiles[1277]: ACLs are not supported, ignoring. Oct 9 02:44:05.246013 systemd-tmpfiles[1277]: ACLs are not supported, ignoring. Oct 9 02:44:05.251826 systemd-tmpfiles[1277]: Detected autofs mount point /boot during canonicalization of boot. Oct 9 02:44:05.251900 systemd-tmpfiles[1277]: Skipping /boot Oct 9 02:44:05.265765 systemd-tmpfiles[1277]: Detected autofs mount point /boot during canonicalization of boot. Oct 9 02:44:05.265834 systemd-tmpfiles[1277]: Skipping /boot Oct 9 02:44:05.345616 zram_generator::config[1317]: No configuration found. Oct 9 02:44:05.426292 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Oct 9 02:44:05.467431 systemd[1]: Reloading finished in 235 ms. 
Oct 9 02:44:05.482008 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Oct 9 02:44:05.482995 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 9 02:44:05.507126 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 9 02:44:05.511551 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Oct 9 02:44:05.515757 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Oct 9 02:44:05.519998 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 9 02:44:05.525556 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 9 02:44:05.528817 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Oct 9 02:44:05.532372 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:05.532518 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 9 02:44:05.540821 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 9 02:44:05.550998 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 9 02:44:05.556922 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 9 02:44:05.557807 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 9 02:44:05.559732 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:05.568789 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Oct 9 02:44:05.570624 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Oct 9 02:44:05.570835 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 9 02:44:05.578725 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 9 02:44:05.579098 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 9 02:44:05.580864 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 9 02:44:05.581397 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 9 02:44:05.595489 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Oct 9 02:44:05.597186 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:05.598905 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 9 02:44:05.605877 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 9 02:44:05.609826 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 9 02:44:05.613850 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 9 02:44:05.616441 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 9 02:44:05.617699 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 9 02:44:05.617775 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:05.618502 systemd[1]: Finished ensure-sysext.service. Oct 9 02:44:05.626814 systemd-udevd[1357]: Using default interface naming scheme 'v255'. Oct 9 02:44:05.631752 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Oct 9 02:44:05.637972 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Oct 9 02:44:05.639663 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 9 02:44:05.655152 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Oct 9 02:44:05.664830 systemd[1]: Starting systemd-update-done.service - Update is Completed... Oct 9 02:44:05.665487 systemd[1]: Started systemd-userdbd.service - User Database Manager. Oct 9 02:44:05.666541 augenrules[1392]: No rules Oct 9 02:44:05.667283 systemd[1]: audit-rules.service: Deactivated successfully. Oct 9 02:44:05.667698 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 9 02:44:05.676369 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 9 02:44:05.689024 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 9 02:44:05.690101 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 9 02:44:05.690681 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 9 02:44:05.693699 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 9 02:44:05.695291 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 9 02:44:05.697647 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 9 02:44:05.699049 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 9 02:44:05.699199 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 9 02:44:05.700404 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 9 02:44:05.704585 systemd[1]: Finished systemd-update-done.service - Update is Completed. Oct 9 02:44:05.720452 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Oct 9 02:44:05.722456 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 9 02:44:05.821636 kernel: BTRFS info: devid 1 device path /dev/mapper/usr changed to /dev/dm-0 scanned by (udev-worker) (1405) Oct 9 02:44:05.822251 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Oct 9 02:44:05.825632 kernel: BTRFS info: devid 1 device path /dev/dm-0 changed to /dev/mapper/usr scanned by (udev-worker) (1405) Oct 9 02:44:05.878022 systemd-networkd[1402]: lo: Link UP Oct 9 02:44:05.881640 systemd-networkd[1402]: lo: Gained carrier Oct 9 02:44:05.890785 systemd-networkd[1402]: Enumeration completed Oct 9 02:44:05.890911 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 9 02:44:05.897785 systemd-networkd[1402]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:44:05.897794 systemd-networkd[1402]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 9 02:44:05.901189 systemd-networkd[1402]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:44:05.901198 systemd-networkd[1402]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 9 02:44:05.901694 systemd-networkd[1402]: eth0: Link UP Oct 9 02:44:05.901698 systemd-networkd[1402]: eth0: Gained carrier Oct 9 02:44:05.901710 systemd-networkd[1402]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:44:05.902056 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Oct 9 02:44:05.904455 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. 
Oct 9 02:44:05.905745 systemd[1]: Reached target time-set.target - System Time Set. Oct 9 02:44:05.907864 systemd-networkd[1402]: eth1: Link UP Oct 9 02:44:05.907868 systemd-networkd[1402]: eth1: Gained carrier Oct 9 02:44:05.907880 systemd-networkd[1402]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:44:05.913957 systemd-resolved[1354]: Positive Trust Anchors: Oct 9 02:44:05.914889 systemd-resolved[1354]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 9 02:44:05.914988 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 9 02:44:05.920589 systemd-resolved[1354]: Using system hostname 'ci-4116-0-0-c-ec98df32e3'. Oct 9 02:44:05.924394 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 9 02:44:05.925722 systemd[1]: Reached target network.target - Network. Oct 9 02:44:05.927515 systemd-networkd[1402]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Oct 9 02:44:05.927592 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Oct 9 02:44:05.942732 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1427) Oct 9 02:44:05.949787 systemd-networkd[1402]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Oct 9 02:44:05.951288 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection. Oct 9 02:44:05.967696 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Oct 9 02:44:05.978096 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Oct 9 02:44:05.980863 kernel: ACPI: button: Power Button [PWRF] Oct 9 02:44:05.986040 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Oct 9 02:44:05.997629 kernel: mousedev: PS/2 mouse device common for all mice Oct 9 02:44:06.004576 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Oct 9 02:44:06.012294 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Oct 9 02:44:06.012334 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:06.012435 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 9 02:44:06.017764 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 9 02:44:06.021061 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 9 02:44:06.024557 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 9 02:44:06.026729 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Oct 9 02:44:06.026770 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 9 02:44:06.026783 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 9 02:44:06.027206 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 9 02:44:06.027390 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 9 02:44:06.033672 systemd-networkd[1402]: eth0: DHCPv4 address 188.245.48.63/32, gateway 172.31.1.1 acquired from 172.31.1.1 Oct 9 02:44:06.035687 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection. Oct 9 02:44:06.041880 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 9 02:44:06.042071 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 9 02:44:06.042627 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Oct 9 02:44:06.045030 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 9 02:44:06.045223 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 9 02:44:06.048653 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 9 02:44:06.048724 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Oct 9 02:44:06.056646 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Oct 9 02:44:06.058010 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Oct 9 02:44:06.059852 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Oct 9 02:44:06.066994 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Oct 9 02:44:06.067021 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Oct 9 02:44:06.070003 kernel: Console: switching to colour dummy device 80x25 Oct 9 02:44:06.071023 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Oct 9 02:44:06.071064 kernel: [drm] features: -context_init Oct 9 02:44:06.073628 kernel: [drm] number of scanouts: 1 Oct 9 02:44:06.073654 kernel: [drm] number of cap sets: 0 Oct 9 02:44:06.075636 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Oct 9 02:44:06.085512 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Oct 9 02:44:06.085577 kernel: Console: switching to colour frame buffer device 160x50 Oct 9 02:44:06.095640 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Oct 9 02:44:06.109654 kernel: EDAC MC: Ver: 3.0.0 Oct 9 02:44:06.110866 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 9 02:44:06.120403 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 9 02:44:06.120654 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 9 02:44:06.127760 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 9 02:44:06.132882 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 9 02:44:06.133140 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 9 02:44:06.140739 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 9 02:44:06.197154 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Oct 9 02:44:06.273870 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Oct 9 02:44:06.283012 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Oct 9 02:44:06.297812 lvm[1472]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Oct 9 02:44:06.330157 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Oct 9 02:44:06.332010 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 9 02:44:06.332766 systemd[1]: Reached target sysinit.target - System Initialization. Oct 9 02:44:06.332991 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Oct 9 02:44:06.333106 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Oct 9 02:44:06.333396 systemd[1]: Started logrotate.timer - Daily rotation of log files. Oct 9 02:44:06.333614 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Oct 9 02:44:06.333696 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Oct 9 02:44:06.333762 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Oct 9 02:44:06.333786 systemd[1]: Reached target paths.target - Path Units. Oct 9 02:44:06.333841 systemd[1]: Reached target timers.target - Timer Units. Oct 9 02:44:06.341697 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Oct 9 02:44:06.343766 systemd[1]: Starting docker.socket - Docker Socket for the API... Oct 9 02:44:06.351073 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Oct 9 02:44:06.352816 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... 
Oct 9 02:44:06.354085 systemd[1]: Listening on docker.socket - Docker Socket for the API. Oct 9 02:44:06.355124 systemd[1]: Reached target sockets.target - Socket Units. Oct 9 02:44:06.356739 systemd[1]: Reached target basic.target - Basic System. Oct 9 02:44:06.358831 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Oct 9 02:44:06.358873 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Oct 9 02:44:06.360721 systemd[1]: Starting containerd.service - containerd container runtime... Oct 9 02:44:06.362310 lvm[1476]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Oct 9 02:44:06.364776 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Oct 9 02:44:06.375814 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Oct 9 02:44:06.382161 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Oct 9 02:44:06.393771 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Oct 9 02:44:06.394324 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Oct 9 02:44:06.397531 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Oct 9 02:44:06.400951 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Oct 9 02:44:06.405737 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Oct 9 02:44:06.421434 coreos-metadata[1478]: Oct 09 02:44:06.415 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Oct 9 02:44:06.421434 coreos-metadata[1478]: Oct 09 02:44:06.416 INFO Fetch successful Oct 9 02:44:06.421434 coreos-metadata[1478]: Oct 09 02:44:06.418 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Oct 9 02:44:06.421434 coreos-metadata[1478]: Oct 09 02:44:06.419 INFO Fetch successful Oct 9 02:44:06.416720 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Oct 9 02:44:06.422140 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Oct 9 02:44:06.430708 jq[1480]: false Oct 9 02:44:06.435720 systemd[1]: Starting systemd-logind.service - User Login Management... Oct 9 02:44:06.437346 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Oct 9 02:44:06.437786 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Oct 9 02:44:06.440735 systemd[1]: Starting update-engine.service - Update Engine... Oct 9 02:44:06.439905 dbus-daemon[1479]: [system] SELinux support is enabled Oct 9 02:44:06.442885 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Oct 9 02:44:06.446484 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Oct 9 02:44:06.450555 extend-filesystems[1481]: Found loop4 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found loop5 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found loop6 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found loop7 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda1 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda2 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda3 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found usr Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda4 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda6 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda7 Oct 9 02:44:06.450555 extend-filesystems[1481]: Found sda9 Oct 9 02:44:06.450555 extend-filesystems[1481]: Checking size of /dev/sda9 Oct 9 02:44:06.547165 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1420) Oct 9 02:44:06.547196 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Oct 9 02:44:06.547254 extend-filesystems[1481]: Resized partition /dev/sda9 Oct 9 02:44:06.454857 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Oct 9 02:44:06.558545 extend-filesystems[1508]: resize2fs 1.47.1 (20-May-2024) Oct 9 02:44:06.471096 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Oct 9 02:44:06.571385 jq[1498]: true Oct 9 02:44:06.473487 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Oct 9 02:44:06.491134 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Oct 9 02:44:06.491176 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Oct 9 02:44:06.496617 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Oct 9 02:44:06.496638 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Oct 9 02:44:06.509851 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Oct 9 02:44:06.510091 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Oct 9 02:44:06.514791 systemd[1]: motdgen.service: Deactivated successfully.
Oct 9 02:44:06.515026 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Oct 9 02:44:06.564120 (ntainerd)[1509]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Oct 9 02:44:06.580765 jq[1507]: true
Oct 9 02:44:06.587614 tar[1506]: linux-amd64/helm
Oct 9 02:44:06.588383 update_engine[1497]: I20241009 02:44:06.588317 1497 main.cc:92] Flatcar Update Engine starting
Oct 9 02:44:06.598129 update_engine[1497]: I20241009 02:44:06.597222 1497 update_check_scheduler.cc:74] Next update check in 2m15s
Oct 9 02:44:06.628716 systemd[1]: Started update-engine.service - Update Engine.
Oct 9 02:44:06.642741 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Oct 9 02:44:06.687011 systemd-logind[1493]: New seat seat0.
Oct 9 02:44:06.702260 systemd-logind[1493]: Watching system buttons on /dev/input/event2 (Power Button)
Oct 9 02:44:06.707919 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Oct 9 02:44:06.711468 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Oct 9 02:44:06.711554 systemd-logind[1493]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Oct 9 02:44:06.712452 systemd[1]: Started systemd-logind.service - User Login Management.
Oct 9 02:44:06.779464 bash[1552]: Updated "/home/core/.ssh/authorized_keys"
Oct 9 02:44:06.774871 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Oct 9 02:44:06.784046 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Oct 9 02:44:06.791894 systemd[1]: Starting sshkeys.service...
Oct 9 02:44:06.801676 extend-filesystems[1508]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Oct 9 02:44:06.801676 extend-filesystems[1508]: old_desc_blocks = 1, new_desc_blocks = 5
Oct 9 02:44:06.801676 extend-filesystems[1508]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Oct 9 02:44:06.809460 extend-filesystems[1481]: Resized filesystem in /dev/sda9
Oct 9 02:44:06.809460 extend-filesystems[1481]: Found sr0
Oct 9 02:44:06.806055 systemd[1]: extend-filesystems.service: Deactivated successfully.
Oct 9 02:44:06.806267 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Oct 9 02:44:06.837444 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Oct 9 02:44:06.846997 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Oct 9 02:44:06.896355 coreos-metadata[1562]: Oct 09 02:44:06.896 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Oct 9 02:44:06.899301 coreos-metadata[1562]: Oct 09 02:44:06.898 INFO Fetch successful
Oct 9 02:44:06.901108 unknown[1562]: wrote ssh authorized keys file for user: core
Oct 9 02:44:06.907906 locksmithd[1532]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Oct 9 02:44:06.912828 containerd[1509]: time="2024-10-09T02:44:06.912066496Z" level=info msg="starting containerd" revision=b2ce781edcbd6cb758f172ecab61c79d607cc41d version=v1.7.22
Oct 9 02:44:06.919550 sshd_keygen[1519]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Oct 9 02:44:06.936272 update-ssh-keys[1568]: Updated "/home/core/.ssh/authorized_keys"
Oct 9 02:44:06.937835 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Oct 9 02:44:06.942907 systemd[1]: Finished sshkeys.service.
Oct 9 02:44:06.954060 containerd[1509]: time="2024-10-09T02:44:06.952789272Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.954897 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Oct 9 02:44:06.956465 containerd[1509]: time="2024-10-09T02:44:06.956433928Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.54-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Oct 9 02:44:06.956543 containerd[1509]: time="2024-10-09T02:44:06.956527063Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Oct 9 02:44:06.956632 containerd[1509]: time="2024-10-09T02:44:06.956616480Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Oct 9 02:44:06.956847 containerd[1509]: time="2024-10-09T02:44:06.956831353Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Oct 9 02:44:06.956900 containerd[1509]: time="2024-10-09T02:44:06.956888981Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.957017 containerd[1509]: time="2024-10-09T02:44:06.957000250Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Oct 9 02:44:06.957156 containerd[1509]: time="2024-10-09T02:44:06.957142988Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.957439 containerd[1509]: time="2024-10-09T02:44:06.957420277Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Oct 9 02:44:06.957825 containerd[1509]: time="2024-10-09T02:44:06.957810340Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.958061 containerd[1509]: time="2024-10-09T02:44:06.958046933Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Oct 9 02:44:06.958104 containerd[1509]: time="2024-10-09T02:44:06.958092969Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.958243 containerd[1509]: time="2024-10-09T02:44:06.958227822Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.958571 containerd[1509]: time="2024-10-09T02:44:06.958554665Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Oct 9 02:44:06.959173 containerd[1509]: time="2024-10-09T02:44:06.959156083Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Oct 9 02:44:06.959558 containerd[1509]: time="2024-10-09T02:44:06.959543400Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Oct 9 02:44:06.959742 containerd[1509]: time="2024-10-09T02:44:06.959725621Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Oct 9 02:44:06.959887 containerd[1509]: time="2024-10-09T02:44:06.959873198Z" level=info msg="metadata content store policy set" policy=shared
Oct 9 02:44:06.968008 containerd[1509]: time="2024-10-09T02:44:06.967957500Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Oct 9 02:44:06.968224 containerd[1509]: time="2024-10-09T02:44:06.968204624Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968316744Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968338114Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968353283Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968478487Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968696837Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968798077Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968811903Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968824958Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968836519Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968849143Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968859282Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968869882Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968881904Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970581 containerd[1509]: time="2024-10-09T02:44:06.968893045Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.968790 systemd[1]: Starting issuegen.service - Generate /run/issue...
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968902864Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968912782Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968929333Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968940403Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968950273Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968960551Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968970240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.968986911Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.969000968Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.969010816Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.969021897Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.969034711Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.969043667Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.970878 containerd[1509]: time="2024-10-09T02:44:06.969053446Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969064346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969076109Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969092970Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969103360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969112827Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969152722Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969165385Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969174222Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969183840Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969192556Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969203598Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969211843Z" level=info msg="NRI interface is disabled by configuration."
Oct 9 02:44:06.971098 containerd[1509]: time="2024-10-09T02:44:06.969220259Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Oct 9 02:44:06.971283 containerd[1509]: time="2024-10-09T02:44:06.969440732Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Oct 9 02:44:06.971283 containerd[1509]: time="2024-10-09T02:44:06.969481008Z" level=info msg="Connect containerd service"
Oct 9 02:44:06.971283 containerd[1509]: time="2024-10-09T02:44:06.969513428Z" level=info msg="using legacy CRI server"
Oct 9 02:44:06.971283 containerd[1509]: time="2024-10-09T02:44:06.969519169Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Oct 9 02:44:06.971283 containerd[1509]: time="2024-10-09T02:44:06.969596014Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Oct 9 02:44:06.971283 containerd[1509]: time="2024-10-09T02:44:06.970147017Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Oct 9 02:44:06.973429 containerd[1509]: time="2024-10-09T02:44:06.973402673Z" level=info msg="Start subscribing containerd event"
Oct 9 02:44:06.973562 containerd[1509]: time="2024-10-09T02:44:06.973547535Z" level=info msg="Start recovering state"
Oct 9 02:44:06.974058 containerd[1509]: time="2024-10-09T02:44:06.973873375Z" level=info msg="Start event monitor"
Oct 9 02:44:06.974058 containerd[1509]: time="2024-10-09T02:44:06.973902109Z" level=info msg="Start snapshots syncer"
Oct 9 02:44:06.974058 containerd[1509]: time="2024-10-09T02:44:06.973911698Z" level=info msg="Start cni network conf syncer for default"
Oct 9 02:44:06.974058 containerd[1509]: time="2024-10-09T02:44:06.973919292Z" level=info msg="Start streaming server"
Oct 9 02:44:06.974955 containerd[1509]: time="2024-10-09T02:44:06.974685759Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Oct 9 02:44:06.975078 containerd[1509]: time="2024-10-09T02:44:06.975050703Z" level=info msg=serving... address=/run/containerd/containerd.sock
Oct 9 02:44:06.975729 systemd[1]: issuegen.service: Deactivated successfully.
Oct 9 02:44:06.975925 systemd[1]: Finished issuegen.service - Generate /run/issue.
Oct 9 02:44:06.977516 containerd[1509]: time="2024-10-09T02:44:06.977497994Z" level=info msg="containerd successfully booted in 0.067666s"
Oct 9 02:44:06.979075 systemd[1]: Started containerd.service - containerd container runtime.
Oct 9 02:44:06.986846 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Oct 9 02:44:06.999631 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Oct 9 02:44:07.007440 systemd[1]: Started getty@tty1.service - Getty on tty1.
Oct 9 02:44:07.011354 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Oct 9 02:44:07.013750 systemd[1]: Reached target getty.target - Login Prompts.
Oct 9 02:44:07.091807 systemd-networkd[1402]: eth1: Gained IPv6LL
Oct 9 02:44:07.093365 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Oct 9 02:44:07.095803 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Oct 9 02:44:07.098803 systemd[1]: Reached target network-online.target - Network is Online.
Oct 9 02:44:07.113399 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:44:07.119708 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Oct 9 02:44:07.155191 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Oct 9 02:44:07.271943 tar[1506]: linux-amd64/LICENSE
Oct 9 02:44:07.272075 tar[1506]: linux-amd64/README.md
Oct 9 02:44:07.282272 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Oct 9 02:44:07.795880 systemd-networkd[1402]: eth0: Gained IPv6LL
Oct 9 02:44:07.796789 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Oct 9 02:44:07.836792 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:44:07.837839 systemd[1]: Reached target multi-user.target - Multi-User System.
Oct 9 02:44:07.840559 systemd[1]: Startup finished in 1.195s (kernel) + 6.934s (initrd) + 4.257s (userspace) = 12.386s.
Oct 9 02:44:07.843380 (kubelet)[1608]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:44:08.467888 kubelet[1608]: E1009 02:44:08.467795 1608 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:44:08.472123 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:44:08.472304 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:44:18.584393 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Oct 9 02:44:18.590130 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:44:18.712788 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:44:18.712940 (kubelet)[1627]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:44:18.747255 kubelet[1627]: E1009 02:44:18.747102 1627 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:44:18.755031 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:44:18.755217 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:44:28.834321 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Oct 9 02:44:28.839763 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:44:28.962957 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:44:28.966961 (kubelet)[1643]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:44:29.010072 kubelet[1643]: E1009 02:44:29.009957 1643 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:44:29.013577 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:44:29.013824 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:44:38.786424 systemd-timesyncd[1386]: Contacted time server 176.9.157.155:123 (2.flatcar.pool.ntp.org).
Oct 9 02:44:38.786532 systemd-timesyncd[1386]: Initial clock synchronization to Wed 2024-10-09 02:44:38.786213 UTC.
Oct 9 02:44:38.786665 systemd-resolved[1354]: Clock change detected. Flushing caches.
Oct 9 02:44:39.729260 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Oct 9 02:44:39.734626 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:44:39.867422 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:44:39.875792 (kubelet)[1659]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:44:39.918692 kubelet[1659]: E1009 02:44:39.918637 1659 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:44:39.922617 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:44:39.922802 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:44:49.979209 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Oct 9 02:44:49.984621 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:44:50.103244 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:44:50.107806 (kubelet)[1675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:44:50.153013 kubelet[1675]: E1009 02:44:50.152957 1675 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:44:50.157413 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:44:50.157634 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:44:52.143357 update_engine[1497]: I20241009 02:44:52.143210 1497 update_attempter.cc:509] Updating boot flags...
Oct 9 02:44:52.192315 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1693)
Oct 9 02:44:52.257474 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1696)
Oct 9 02:44:52.303475 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1696)
Oct 9 02:45:00.229036 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Oct 9 02:45:00.234767 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:45:00.358551 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:45:00.362471 (kubelet)[1713]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:45:00.400043 kubelet[1713]: E1009 02:45:00.399965 1713 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:45:00.404035 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:45:00.404266 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:45:10.479250 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Oct 9 02:45:10.484838 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:45:10.609354 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:45:10.613164 (kubelet)[1730]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:45:10.656557 kubelet[1730]: E1009 02:45:10.656490 1730 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:45:10.660218 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:45:10.660407 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:45:20.729220 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Oct 9 02:45:20.734607 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:45:20.866604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:45:20.867912 (kubelet)[1747]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:45:20.912977 kubelet[1747]: E1009 02:45:20.912846 1747 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:45:20.916797 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:45:20.917046 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:45:30.979200 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Oct 9 02:45:30.984616 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:45:31.114054 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:45:31.126718 (kubelet)[1763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Oct 9 02:45:31.169910 kubelet[1763]: E1009 02:45:31.169794 1763 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Oct 9 02:45:31.174004 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Oct 9 02:45:31.174263 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Oct 9 02:45:41.229163 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Oct 9 02:45:41.235626 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:45:41.358984 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:45:41.370698 (kubelet)[1780]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 9 02:45:41.415689 kubelet[1780]: E1009 02:45:41.415584 1780 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 9 02:45:41.419278 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 9 02:45:41.419495 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 9 02:45:51.479381 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Oct 9 02:45:51.486611 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:45:51.609261 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:45:51.613452 (kubelet)[1795]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 9 02:45:51.652955 kubelet[1795]: E1009 02:45:51.652849 1795 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 9 02:45:51.656520 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 9 02:45:51.656715 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Oct 9 02:46:01.729200 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Oct 9 02:46:01.734925 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:46:01.886199 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:01.890534 (kubelet)[1812]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 9 02:46:01.933697 kubelet[1812]: E1009 02:46:01.933626 1812 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 9 02:46:01.937758 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 9 02:46:01.937950 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 9 02:46:06.096026 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Oct 9 02:46:06.100689 systemd[1]: Started sshd@0-188.245.48.63:22-139.178.68.195:59100.service - OpenSSH per-connection server daemon (139.178.68.195:59100). Oct 9 02:46:07.117563 sshd[1821]: Accepted publickey for core from 139.178.68.195 port 59100 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:07.121848 sshd[1821]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:07.137713 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Oct 9 02:46:07.144843 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Oct 9 02:46:07.150473 systemd-logind[1493]: New session 1 of user core. Oct 9 02:46:07.179976 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. 
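The loop above repeats every ten seconds (restart counters 6 through 11 so far) because kubelet exits immediately when `/var/lib/kubelet/config.yaml` is absent, and systemd's restart policy keeps relaunching it. That file is only written by `kubeadm init` or `kubeadm join`, so these failures are expected on a node that has not yet joined a cluster. A minimal sketch of the check kubelet is failing on — the path is taken from the error message; the helper function is illustrative, not part of kubelet:

```python
from pathlib import Path

# Path copied from the "failed to load kubelet config file" error above.
KUBELET_CONFIG = Path("/var/lib/kubelet/config.yaml")

def diagnose(config_path: Path) -> str:
    """Mirror kubelet's startup check: it refuses to run until
    kubeadm init/join has written its config file."""
    if config_path.is_file():
        return "config present; kubelet should start"
    return "config missing; expected until 'kubeadm init' or 'kubeadm join' runs"

print(diagnose(KUBELET_CONFIG))
```

On this node the loop resolves itself later in the log, once the join flow writes the config and systemd reloads the unit.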
Oct 9 02:46:07.189105 systemd[1]: Starting user@500.service - User Manager for UID 500... Oct 9 02:46:07.205653 (systemd)[1825]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Oct 9 02:46:07.324054 systemd[1825]: Queued start job for default target default.target. Oct 9 02:46:07.334618 systemd[1825]: Created slice app.slice - User Application Slice. Oct 9 02:46:07.334644 systemd[1825]: Reached target paths.target - Paths. Oct 9 02:46:07.334656 systemd[1825]: Reached target timers.target - Timers. Oct 9 02:46:07.336105 systemd[1825]: Starting dbus.socket - D-Bus User Message Bus Socket... Oct 9 02:46:07.375054 systemd[1825]: Listening on dbus.socket - D-Bus User Message Bus Socket. Oct 9 02:46:07.375375 systemd[1825]: Reached target sockets.target - Sockets. Oct 9 02:46:07.375428 systemd[1825]: Reached target basic.target - Basic System. Oct 9 02:46:07.375623 systemd[1825]: Reached target default.target - Main User Target. Oct 9 02:46:07.375695 systemd[1825]: Startup finished in 159ms. Oct 9 02:46:07.375995 systemd[1]: Started user@500.service - User Manager for UID 500. Oct 9 02:46:07.394627 systemd[1]: Started session-1.scope - Session 1 of User core. Oct 9 02:46:08.100702 systemd[1]: Started sshd@1-188.245.48.63:22-139.178.68.195:59114.service - OpenSSH per-connection server daemon (139.178.68.195:59114). Oct 9 02:46:09.108227 sshd[1836]: Accepted publickey for core from 139.178.68.195 port 59114 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:09.109999 sshd[1836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:09.114959 systemd-logind[1493]: New session 2 of user core. Oct 9 02:46:09.130601 systemd[1]: Started session-2.scope - Session 2 of User core. Oct 9 02:46:09.799504 sshd[1836]: pam_unix(sshd:session): session closed for user core Oct 9 02:46:09.804224 systemd[1]: sshd@1-188.245.48.63:22-139.178.68.195:59114.service: Deactivated successfully. 
Oct 9 02:46:09.806821 systemd[1]: session-2.scope: Deactivated successfully. Oct 9 02:46:09.807522 systemd-logind[1493]: Session 2 logged out. Waiting for processes to exit. Oct 9 02:46:09.808899 systemd-logind[1493]: Removed session 2. Oct 9 02:46:09.973696 systemd[1]: Started sshd@2-188.245.48.63:22-139.178.68.195:59120.service - OpenSSH per-connection server daemon (139.178.68.195:59120). Oct 9 02:46:10.973810 sshd[1843]: Accepted publickey for core from 139.178.68.195 port 59120 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:10.975547 sshd[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:10.980856 systemd-logind[1493]: New session 3 of user core. Oct 9 02:46:10.983590 systemd[1]: Started session-3.scope - Session 3 of User core. Oct 9 02:46:11.664989 sshd[1843]: pam_unix(sshd:session): session closed for user core Oct 9 02:46:11.671795 systemd[1]: sshd@2-188.245.48.63:22-139.178.68.195:59120.service: Deactivated successfully. Oct 9 02:46:11.676274 systemd[1]: session-3.scope: Deactivated successfully. Oct 9 02:46:11.677564 systemd-logind[1493]: Session 3 logged out. Waiting for processes to exit. Oct 9 02:46:11.679162 systemd-logind[1493]: Removed session 3. Oct 9 02:46:11.847844 systemd[1]: Started sshd@3-188.245.48.63:22-139.178.68.195:45964.service - OpenSSH per-connection server daemon (139.178.68.195:45964). Oct 9 02:46:11.979987 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Oct 9 02:46:11.986884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:46:12.138101 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Oct 9 02:46:12.144771 (kubelet)[1860]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 9 02:46:12.182307 kubelet[1860]: E1009 02:46:12.182238 1860 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 9 02:46:12.185999 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 9 02:46:12.186241 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 9 02:46:12.854289 sshd[1850]: Accepted publickey for core from 139.178.68.195 port 45964 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:12.857560 sshd[1850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:12.866405 systemd-logind[1493]: New session 4 of user core. Oct 9 02:46:12.876734 systemd[1]: Started session-4.scope - Session 4 of User core. Oct 9 02:46:13.573212 sshd[1850]: pam_unix(sshd:session): session closed for user core Oct 9 02:46:13.577309 systemd-logind[1493]: Session 4 logged out. Waiting for processes to exit. Oct 9 02:46:13.578156 systemd[1]: sshd@3-188.245.48.63:22-139.178.68.195:45964.service: Deactivated successfully. Oct 9 02:46:13.580361 systemd[1]: session-4.scope: Deactivated successfully. Oct 9 02:46:13.581350 systemd-logind[1493]: Removed session 4. Oct 9 02:46:13.752402 systemd[1]: Started sshd@4-188.245.48.63:22-139.178.68.195:45980.service - OpenSSH per-connection server daemon (139.178.68.195:45980). 
Oct 9 02:46:14.751984 sshd[1874]: Accepted publickey for core from 139.178.68.195 port 45980 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:14.753543 sshd[1874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:14.758570 systemd-logind[1493]: New session 5 of user core. Oct 9 02:46:14.763628 systemd[1]: Started session-5.scope - Session 5 of User core. Oct 9 02:46:15.295778 sudo[1877]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Oct 9 02:46:15.296155 sudo[1877]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 9 02:46:15.309314 sudo[1877]: pam_unix(sudo:session): session closed for user root Oct 9 02:46:15.474122 sshd[1874]: pam_unix(sshd:session): session closed for user core Oct 9 02:46:15.478896 systemd-logind[1493]: Session 5 logged out. Waiting for processes to exit. Oct 9 02:46:15.479723 systemd[1]: sshd@4-188.245.48.63:22-139.178.68.195:45980.service: Deactivated successfully. Oct 9 02:46:15.482083 systemd[1]: session-5.scope: Deactivated successfully. Oct 9 02:46:15.483150 systemd-logind[1493]: Removed session 5. Oct 9 02:46:15.647264 systemd[1]: Started sshd@5-188.245.48.63:22-139.178.68.195:45996.service - OpenSSH per-connection server daemon (139.178.68.195:45996). Oct 9 02:46:16.656213 sshd[1882]: Accepted publickey for core from 139.178.68.195 port 45996 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:16.657937 sshd[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:16.662523 systemd-logind[1493]: New session 6 of user core. Oct 9 02:46:16.677590 systemd[1]: Started session-6.scope - Session 6 of User core. 
Oct 9 02:46:17.188380 sudo[1886]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Oct 9 02:46:17.188971 sudo[1886]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 9 02:46:17.193515 sudo[1886]: pam_unix(sudo:session): session closed for user root Oct 9 02:46:17.200227 sudo[1885]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Oct 9 02:46:17.200700 sudo[1885]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 9 02:46:17.214733 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 9 02:46:17.243906 augenrules[1908]: No rules Oct 9 02:46:17.244810 systemd[1]: audit-rules.service: Deactivated successfully. Oct 9 02:46:17.245155 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 9 02:46:17.246569 sudo[1885]: pam_unix(sudo:session): session closed for user root Oct 9 02:46:17.407864 sshd[1882]: pam_unix(sshd:session): session closed for user core Oct 9 02:46:17.411861 systemd-logind[1493]: Session 6 logged out. Waiting for processes to exit. Oct 9 02:46:17.412599 systemd[1]: sshd@5-188.245.48.63:22-139.178.68.195:45996.service: Deactivated successfully. Oct 9 02:46:17.414644 systemd[1]: session-6.scope: Deactivated successfully. Oct 9 02:46:17.415460 systemd-logind[1493]: Removed session 6. Oct 9 02:46:17.579535 systemd[1]: Started sshd@6-188.245.48.63:22-139.178.68.195:46006.service - OpenSSH per-connection server daemon (139.178.68.195:46006). Oct 9 02:46:18.577224 sshd[1916]: Accepted publickey for core from 139.178.68.195 port 46006 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw Oct 9 02:46:18.578932 sshd[1916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 9 02:46:18.583669 systemd-logind[1493]: New session 7 of user core. Oct 9 02:46:18.588592 systemd[1]: Started session-7.scope - Session 7 of User core. 
Oct 9 02:46:19.107829 sudo[1919]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Oct 9 02:46:19.108323 sudo[1919]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 9 02:46:19.343924 (dockerd)[1937]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Oct 9 02:46:19.344641 systemd[1]: Starting docker.service - Docker Application Container Engine... Oct 9 02:46:19.556565 dockerd[1937]: time="2024-10-09T02:46:19.556143851Z" level=info msg="Starting up" Oct 9 02:46:19.619742 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1999371706-merged.mount: Deactivated successfully. Oct 9 02:46:19.654013 dockerd[1937]: time="2024-10-09T02:46:19.653767740Z" level=info msg="Loading containers: start." Oct 9 02:46:19.820471 kernel: Initializing XFRM netlink socket Oct 9 02:46:19.906807 systemd-networkd[1402]: docker0: Link UP Oct 9 02:46:19.952730 dockerd[1937]: time="2024-10-09T02:46:19.952690850Z" level=info msg="Loading containers: done." Oct 9 02:46:19.968393 dockerd[1937]: time="2024-10-09T02:46:19.968350236Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Oct 9 02:46:19.968562 dockerd[1937]: time="2024-10-09T02:46:19.968454758Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1 Oct 9 02:46:19.968562 dockerd[1937]: time="2024-10-09T02:46:19.968557177Z" level=info msg="Daemon has completed initialization" Oct 9 02:46:20.000079 dockerd[1937]: time="2024-10-09T02:46:19.999929703Z" level=info msg="API listen on /run/docker.sock" Oct 9 02:46:20.000894 systemd[1]: Started docker.service - Docker Application Container Engine. 
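The dockerd entries above bracket the daemon's initialization: "Starting up" at 02:46:19.556 and "Daemon has completed initialization" at 02:46:19.968. A small sketch of extracting that latency from the two timestamps (values copied from the log; the trimming helper is my own, needed because `datetime.fromisoformat` takes at most microsecond precision while dockerd logs nanoseconds):

```python
from datetime import datetime

# Timestamps copied verbatim from the dockerd "Starting up" and
# "Daemon has completed initialization" lines above.
start = "2024-10-09T02:46:19.556143851Z"
done  = "2024-10-09T02:46:19.968557177Z"

def parse_ns(ts: str) -> datetime:
    # Trim the nanosecond digits down to microseconds before parsing.
    base, frac = ts.rstrip("Z").split(".")
    return datetime.fromisoformat(f"{base}.{frac[:6]}")

delta = parse_ns(done) - parse_ns(start)
print(f"dockerd init took {delta.total_seconds():.3f}s")
```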
Oct 9 02:46:20.921066 containerd[1509]: time="2024-10-09T02:46:20.921008504Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.9\"" Oct 9 02:46:21.516321 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2349295764.mount: Deactivated successfully. Oct 9 02:46:22.101577 update_engine[1497]: I20241009 02:46:22.101496 1497 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Oct 9 02:46:22.101577 update_engine[1497]: I20241009 02:46:22.101555 1497 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Oct 9 02:46:22.102024 update_engine[1497]: I20241009 02:46:22.101758 1497 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Oct 9 02:46:22.102356 update_engine[1497]: I20241009 02:46:22.102319 1497 omaha_request_params.cc:62] Current group set to alpha Oct 9 02:46:22.102591 update_engine[1497]: I20241009 02:46:22.102466 1497 update_attempter.cc:499] Already updated boot flags. Skipping. Oct 9 02:46:22.102591 update_engine[1497]: I20241009 02:46:22.102486 1497 update_attempter.cc:643] Scheduling an action processor start. 
Oct 9 02:46:22.102591 update_engine[1497]: I20241009 02:46:22.102504 1497 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Oct 9 02:46:22.102591 update_engine[1497]: I20241009 02:46:22.102533 1497 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Oct 9 02:46:22.102740 update_engine[1497]: I20241009 02:46:22.102600 1497 omaha_request_action.cc:271] Posting an Omaha request to disabled Oct 9 02:46:22.102740 update_engine[1497]: I20241009 02:46:22.102610 1497 omaha_request_action.cc:272] Request: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: Oct 9 02:46:22.102740 update_engine[1497]: I20241009 02:46:22.102619 1497 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Oct 9 02:46:22.103196 locksmithd[1532]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Oct 9 02:46:22.103837 update_engine[1497]: I20241009 02:46:22.103802 1497 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Oct 9 02:46:22.104151 update_engine[1497]: I20241009 02:46:22.104115 1497 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Oct 9 02:46:22.104774 update_engine[1497]: E20241009 02:46:22.104739 1497 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Oct 9 02:46:22.104819 update_engine[1497]: I20241009 02:46:22.104804 1497 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Oct 9 02:46:22.230063 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. Oct 9 02:46:22.244555 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Oct 9 02:46:22.373664 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:22.382025 (kubelet)[2192]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 9 02:46:22.428145 kubelet[2192]: E1009 02:46:22.427845 2192 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 9 02:46:22.431985 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 9 02:46:22.432167 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 9 02:46:22.906691 containerd[1509]: time="2024-10-09T02:46:22.906648811Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:22.908064 containerd[1509]: time="2024-10-09T02:46:22.908042785Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.9: active requests=0, bytes read=35213933" Oct 9 02:46:22.908657 containerd[1509]: time="2024-10-09T02:46:22.908239825Z" level=info msg="ImageCreate event name:\"sha256:bc1ec5c2b6c60a3b18e7f54a99f0452c038400ecaaa2576931fd5342a0586abb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:22.910685 containerd[1509]: time="2024-10-09T02:46:22.910640442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:b88538e7fdf73583c8670540eec5b3620af75c9ec200434a5815ee7fba5021f3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:22.911589 containerd[1509]: time="2024-10-09T02:46:22.911540702Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.9\" with image id 
\"sha256:bc1ec5c2b6c60a3b18e7f54a99f0452c038400ecaaa2576931fd5342a0586abb\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:b88538e7fdf73583c8670540eec5b3620af75c9ec200434a5815ee7fba5021f3\", size \"35210641\" in 1.990492101s" Oct 9 02:46:22.913447 containerd[1509]: time="2024-10-09T02:46:22.911697264Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.9\" returns image reference \"sha256:bc1ec5c2b6c60a3b18e7f54a99f0452c038400ecaaa2576931fd5342a0586abb\"" Oct 9 02:46:22.934780 containerd[1509]: time="2024-10-09T02:46:22.934722993Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.9\"" Oct 9 02:46:24.614090 containerd[1509]: time="2024-10-09T02:46:24.614023970Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:24.615249 containerd[1509]: time="2024-10-09T02:46:24.615209015Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.9: active requests=0, bytes read=32208693" Oct 9 02:46:24.616321 containerd[1509]: time="2024-10-09T02:46:24.616276985Z" level=info msg="ImageCreate event name:\"sha256:5abda0d0a9153cd1f90fd828be379f7a16a6c814e6efbbbf31e247e13c3843e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:24.618828 containerd[1509]: time="2024-10-09T02:46:24.618763822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f2f18973ccb6996687d10ba5bd1b8f303e3dd2fed80f831a44d2ac8191e5bb9b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:24.619715 containerd[1509]: time="2024-10-09T02:46:24.619594674Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.9\" with image id \"sha256:5abda0d0a9153cd1f90fd828be379f7a16a6c814e6efbbbf31e247e13c3843e5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.9\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:f2f18973ccb6996687d10ba5bd1b8f303e3dd2fed80f831a44d2ac8191e5bb9b\", size \"33739229\" in 1.68481424s" Oct 9 02:46:24.619715 containerd[1509]: time="2024-10-09T02:46:24.619622949Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.9\" returns image reference \"sha256:5abda0d0a9153cd1f90fd828be379f7a16a6c814e6efbbbf31e247e13c3843e5\"" Oct 9 02:46:24.639233 containerd[1509]: time="2024-10-09T02:46:24.639177715Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.9\"" Oct 9 02:46:25.853452 containerd[1509]: time="2024-10-09T02:46:25.853373359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:25.854656 containerd[1509]: time="2024-10-09T02:46:25.854592489Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.9: active requests=0, bytes read=17320476" Oct 9 02:46:25.855094 containerd[1509]: time="2024-10-09T02:46:25.855057235Z" level=info msg="ImageCreate event name:\"sha256:059957505b3370d4c57d793e79cc70f9063d7ab75767f7040f5cc85572fe7e8d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:25.857717 containerd[1509]: time="2024-10-09T02:46:25.857681984Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c164076eebaefdaebad46a5ccd550e9f38c63588c02d35163c6a09e164ab8a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:25.858795 containerd[1509]: time="2024-10-09T02:46:25.858694185Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.9\" with image id \"sha256:059957505b3370d4c57d793e79cc70f9063d7ab75767f7040f5cc85572fe7e8d\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c164076eebaefdaebad46a5ccd550e9f38c63588c02d35163c6a09e164ab8a8\", size \"18851030\" in 1.219483325s" Oct 9 02:46:25.858795 containerd[1509]: 
time="2024-10-09T02:46:25.858719774Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.9\" returns image reference \"sha256:059957505b3370d4c57d793e79cc70f9063d7ab75767f7040f5cc85572fe7e8d\"" Oct 9 02:46:25.879733 containerd[1509]: time="2024-10-09T02:46:25.879693857Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.9\"" Oct 9 02:46:27.089391 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount689215398.mount: Deactivated successfully. Oct 9 02:46:27.353158 containerd[1509]: time="2024-10-09T02:46:27.353020540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:27.354076 containerd[1509]: time="2024-10-09T02:46:27.354027818Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.9: active requests=0, bytes read=28601776" Oct 9 02:46:27.355022 containerd[1509]: time="2024-10-09T02:46:27.354975583Z" level=info msg="ImageCreate event name:\"sha256:dd650d127e51776919ec1622a4469a8b141b2dfee5a33fbc5cb9729372e0dcfa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:27.356679 containerd[1509]: time="2024-10-09T02:46:27.356636509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:124040dbe6b5294352355f5d34c692ecbc940cdc57a8fd06d0f38f76b6138906\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:27.357334 containerd[1509]: time="2024-10-09T02:46:27.357184685Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.9\" with image id \"sha256:dd650d127e51776919ec1622a4469a8b141b2dfee5a33fbc5cb9729372e0dcfa\", repo tag \"registry.k8s.io/kube-proxy:v1.29.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:124040dbe6b5294352355f5d34c692ecbc940cdc57a8fd06d0f38f76b6138906\", size \"28600769\" in 1.477458906s" Oct 9 02:46:27.357334 containerd[1509]: time="2024-10-09T02:46:27.357220092Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.9\" 
returns image reference \"sha256:dd650d127e51776919ec1622a4469a8b141b2dfee5a33fbc5cb9729372e0dcfa\"" Oct 9 02:46:27.379124 containerd[1509]: time="2024-10-09T02:46:27.379071096Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Oct 9 02:46:27.915069 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3066923177.mount: Deactivated successfully. Oct 9 02:46:28.689115 containerd[1509]: time="2024-10-09T02:46:28.689017170Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:28.690157 containerd[1509]: time="2024-10-09T02:46:28.690108739Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185841" Oct 9 02:46:28.691215 containerd[1509]: time="2024-10-09T02:46:28.691165081Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:28.694306 containerd[1509]: time="2024-10-09T02:46:28.694242489Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:28.695618 containerd[1509]: time="2024-10-09T02:46:28.695414674Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.316305384s" Oct 9 02:46:28.695618 containerd[1509]: time="2024-10-09T02:46:28.695521529Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference 
\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Oct 9 02:46:28.716536 containerd[1509]: time="2024-10-09T02:46:28.716511638Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Oct 9 02:46:29.200427 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2900870124.mount: Deactivated successfully. Oct 9 02:46:29.208323 containerd[1509]: time="2024-10-09T02:46:29.208251712Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:29.209073 containerd[1509]: time="2024-10-09T02:46:29.209028765Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322310" Oct 9 02:46:29.210163 containerd[1509]: time="2024-10-09T02:46:29.210118559Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:29.212201 containerd[1509]: time="2024-10-09T02:46:29.212176134Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:29.213513 containerd[1509]: time="2024-10-09T02:46:29.213042970Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 496.406391ms" Oct 9 02:46:29.213513 containerd[1509]: time="2024-10-09T02:46:29.213073971Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Oct 9 02:46:29.235028 containerd[1509]: time="2024-10-09T02:46:29.234952631Z" level=info 
msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\"" Oct 9 02:46:29.781751 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount220224568.mount: Deactivated successfully. Oct 9 02:46:32.100474 update_engine[1497]: I20241009 02:46:32.100230 1497 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Oct 9 02:46:32.100474 update_engine[1497]: I20241009 02:46:32.100481 1497 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Oct 9 02:46:32.100912 update_engine[1497]: I20241009 02:46:32.100670 1497 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Oct 9 02:46:32.101455 update_engine[1497]: E20241009 02:46:32.101382 1497 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Oct 9 02:46:32.101455 update_engine[1497]: I20241009 02:46:32.101424 1497 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Oct 9 02:46:32.142498 containerd[1509]: time="2024-10-09T02:46:32.142423449Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:32.143422 containerd[1509]: time="2024-10-09T02:46:32.143384945Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=56651705" Oct 9 02:46:32.144184 containerd[1509]: time="2024-10-09T02:46:32.144145954Z" level=info msg="ImageCreate event name:\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:32.146631 containerd[1509]: time="2024-10-09T02:46:32.146589622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:46:32.147540 containerd[1509]: time="2024-10-09T02:46:32.147396861Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id 
\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"56649232\" in 2.912384857s" Oct 9 02:46:32.147540 containerd[1509]: time="2024-10-09T02:46:32.147424784Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\"" Oct 9 02:46:32.479111 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 14. Oct 9 02:46:32.484579 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:46:32.605745 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:32.610745 (kubelet)[2355]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 9 02:46:32.672957 kubelet[2355]: E1009 02:46:32.672877 2355 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 9 02:46:32.677349 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 9 02:46:32.678067 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 9 02:46:34.426195 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:34.438772 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:46:34.460081 systemd[1]: Reloading requested from client PID 2422 ('systemctl') (unit session-7.scope)... Oct 9 02:46:34.460097 systemd[1]: Reloading... Oct 9 02:46:34.607500 zram_generator::config[2468]: No configuration found. 
Oct 9 02:46:34.700620 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Oct 9 02:46:34.784185 systemd[1]: Reloading finished in 323 ms. Oct 9 02:46:34.851959 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:34.855728 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:46:34.857824 systemd[1]: kubelet.service: Deactivated successfully. Oct 9 02:46:34.858069 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:34.862612 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 9 02:46:35.000709 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 9 02:46:35.009828 (kubelet)[2518]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 9 02:46:35.054214 kubelet[2518]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 9 02:46:35.054214 kubelet[2518]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 9 02:46:35.054214 kubelet[2518]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 9 02:46:35.054610 kubelet[2518]: I1009 02:46:35.054266 2518 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 9 02:46:35.254115 kubelet[2518]: I1009 02:46:35.254004 2518 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Oct 9 02:46:35.254115 kubelet[2518]: I1009 02:46:35.254032 2518 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 9 02:46:35.254490 kubelet[2518]: I1009 02:46:35.254458 2518 server.go:919] "Client rotation is on, will bootstrap in background" Oct 9 02:46:35.277137 kubelet[2518]: I1009 02:46:35.276879 2518 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 9 02:46:35.278755 kubelet[2518]: E1009 02:46:35.278728 2518 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://188.245.48.63:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.288865 kubelet[2518]: I1009 02:46:35.288840 2518 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Oct 9 02:46:35.290112 kubelet[2518]: I1009 02:46:35.290082 2518 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 9 02:46:35.291157 kubelet[2518]: I1009 02:46:35.291098 2518 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Oct 9 02:46:35.291258 kubelet[2518]: I1009 02:46:35.291162 2518 topology_manager.go:138] "Creating topology manager with none policy" Oct 9 02:46:35.291258 kubelet[2518]: I1009 02:46:35.291178 2518 container_manager_linux.go:301] "Creating device plugin manager" Oct 9 02:46:35.291343 kubelet[2518]: I1009 
02:46:35.291316 2518 state_mem.go:36] "Initialized new in-memory state store" Oct 9 02:46:35.291445 kubelet[2518]: I1009 02:46:35.291414 2518 kubelet.go:396] "Attempting to sync node with API server" Oct 9 02:46:35.291445 kubelet[2518]: I1009 02:46:35.291444 2518 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 9 02:46:35.291504 kubelet[2518]: I1009 02:46:35.291470 2518 kubelet.go:312] "Adding apiserver pod source" Oct 9 02:46:35.291504 kubelet[2518]: I1009 02:46:35.291483 2518 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 9 02:46:35.292628 kubelet[2518]: W1009 02:46:35.291943 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://188.245.48.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4116-0-0-c-ec98df32e3&limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.292628 kubelet[2518]: E1009 02:46:35.292000 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://188.245.48.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4116-0-0-c-ec98df32e3&limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.294604 kubelet[2518]: W1009 02:46:35.294027 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://188.245.48.63:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.294604 kubelet[2518]: E1009 02:46:35.294080 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://188.245.48.63:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.294604 kubelet[2518]: I1009 02:46:35.294312 2518 
kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.22" apiVersion="v1" Oct 9 02:46:35.298786 kubelet[2518]: I1009 02:46:35.298764 2518 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 9 02:46:35.298989 kubelet[2518]: W1009 02:46:35.298963 2518 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Oct 9 02:46:35.302016 kubelet[2518]: I1009 02:46:35.302000 2518 server.go:1256] "Started kubelet" Oct 9 02:46:35.303066 kubelet[2518]: I1009 02:46:35.302765 2518 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Oct 9 02:46:35.303451 kubelet[2518]: I1009 02:46:35.303411 2518 server.go:461] "Adding debug handlers to kubelet server" Oct 9 02:46:35.306459 kubelet[2518]: I1009 02:46:35.306419 2518 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 9 02:46:35.307125 kubelet[2518]: I1009 02:46:35.306661 2518 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 9 02:46:35.308540 kubelet[2518]: I1009 02:46:35.307998 2518 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 9 02:46:35.308540 kubelet[2518]: E1009 02:46:35.308092 2518 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://188.245.48.63:6443/api/v1/namespaces/default/events\": dial tcp 188.245.48.63:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4116-0-0-c-ec98df32e3.17fca8d648d78ccb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4116-0-0-c-ec98df32e3,UID:ci-4116-0-0-c-ec98df32e3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4116-0-0-c-ec98df32e3,},FirstTimestamp:2024-10-09 02:46:35.301981387 +0000 
UTC m=+0.287945690,LastTimestamp:2024-10-09 02:46:35.301981387 +0000 UTC m=+0.287945690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4116-0-0-c-ec98df32e3,}" Oct 9 02:46:35.311965 kubelet[2518]: E1009 02:46:35.311845 2518 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4116-0-0-c-ec98df32e3\" not found" Oct 9 02:46:35.311965 kubelet[2518]: I1009 02:46:35.311874 2518 volume_manager.go:291] "Starting Kubelet Volume Manager" Oct 9 02:46:35.317095 kubelet[2518]: I1009 02:46:35.317060 2518 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Oct 9 02:46:35.317186 kubelet[2518]: I1009 02:46:35.317157 2518 reconciler_new.go:29] "Reconciler: start to sync state" Oct 9 02:46:35.318964 kubelet[2518]: E1009 02:46:35.318256 2518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.48.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4116-0-0-c-ec98df32e3?timeout=10s\": dial tcp 188.245.48.63:6443: connect: connection refused" interval="200ms" Oct 9 02:46:35.318964 kubelet[2518]: W1009 02:46:35.318309 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://188.245.48.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.318964 kubelet[2518]: E1009 02:46:35.318351 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://188.245.48.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.318964 kubelet[2518]: I1009 02:46:35.318563 2518 factory.go:219] Registration of the crio container factory failed: Get 
"http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 9 02:46:35.322463 kubelet[2518]: E1009 02:46:35.320745 2518 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 9 02:46:35.322463 kubelet[2518]: I1009 02:46:35.320792 2518 factory.go:221] Registration of the containerd container factory successfully Oct 9 02:46:35.322463 kubelet[2518]: I1009 02:46:35.320803 2518 factory.go:221] Registration of the systemd container factory successfully Oct 9 02:46:35.327922 kubelet[2518]: I1009 02:46:35.327879 2518 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 9 02:46:35.329083 kubelet[2518]: I1009 02:46:35.329060 2518 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 9 02:46:35.329156 kubelet[2518]: I1009 02:46:35.329091 2518 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 9 02:46:35.329156 kubelet[2518]: I1009 02:46:35.329116 2518 kubelet.go:2329] "Starting kubelet main sync loop" Oct 9 02:46:35.329208 kubelet[2518]: E1009 02:46:35.329161 2518 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 9 02:46:35.336373 kubelet[2518]: W1009 02:46:35.336331 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://188.245.48.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.336486 kubelet[2518]: E1009 02:46:35.336375 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://188.245.48.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 
188.245.48.63:6443: connect: connection refused Oct 9 02:46:35.353908 kubelet[2518]: I1009 02:46:35.353884 2518 cpu_manager.go:214] "Starting CPU manager" policy="none" Oct 9 02:46:35.354056 kubelet[2518]: I1009 02:46:35.354047 2518 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Oct 9 02:46:35.354131 kubelet[2518]: I1009 02:46:35.354122 2518 state_mem.go:36] "Initialized new in-memory state store" Oct 9 02:46:35.357285 kubelet[2518]: I1009 02:46:35.357263 2518 policy_none.go:49] "None policy: Start" Oct 9 02:46:35.358036 kubelet[2518]: I1009 02:46:35.358016 2518 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 9 02:46:35.358092 kubelet[2518]: I1009 02:46:35.358040 2518 state_mem.go:35] "Initializing new in-memory state store" Oct 9 02:46:35.365737 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Oct 9 02:46:35.374970 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Oct 9 02:46:35.378487 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Oct 9 02:46:35.395469 kubelet[2518]: I1009 02:46:35.395319 2518 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 9 02:46:35.395613 kubelet[2518]: I1009 02:46:35.395592 2518 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 9 02:46:35.400047 kubelet[2518]: E1009 02:46:35.400002 2518 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4116-0-0-c-ec98df32e3\" not found" Oct 9 02:46:35.414087 kubelet[2518]: I1009 02:46:35.414045 2518 kubelet_node_status.go:73] "Attempting to register node" node="ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.414453 kubelet[2518]: E1009 02:46:35.414389 2518 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.48.63:6443/api/v1/nodes\": dial tcp 188.245.48.63:6443: connect: connection refused" node="ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.430045 kubelet[2518]: I1009 02:46:35.430002 2518 topology_manager.go:215] "Topology Admit Handler" podUID="c057fdfa450ff52d1c7dd73b45942f05" podNamespace="kube-system" podName="kube-apiserver-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.431747 kubelet[2518]: I1009 02:46:35.431713 2518 topology_manager.go:215] "Topology Admit Handler" podUID="429a8cc9f3efc1b4b9bded91c3069686" podNamespace="kube-system" podName="kube-controller-manager-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.433395 kubelet[2518]: I1009 02:46:35.433006 2518 topology_manager.go:215] "Topology Admit Handler" podUID="2551eb4063efef7f21b712c31e04aa59" podNamespace="kube-system" podName="kube-scheduler-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.439265 systemd[1]: Created slice kubepods-burstable-podc057fdfa450ff52d1c7dd73b45942f05.slice - libcontainer container kubepods-burstable-podc057fdfa450ff52d1c7dd73b45942f05.slice. 
Oct 9 02:46:35.454467 systemd[1]: Created slice kubepods-burstable-pod429a8cc9f3efc1b4b9bded91c3069686.slice - libcontainer container kubepods-burstable-pod429a8cc9f3efc1b4b9bded91c3069686.slice. Oct 9 02:46:35.462607 systemd[1]: Created slice kubepods-burstable-pod2551eb4063efef7f21b712c31e04aa59.slice - libcontainer container kubepods-burstable-pod2551eb4063efef7f21b712c31e04aa59.slice. Oct 9 02:46:35.519391 kubelet[2518]: E1009 02:46:35.519267 2518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.48.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4116-0-0-c-ec98df32e3?timeout=10s\": dial tcp 188.245.48.63:6443: connect: connection refused" interval="400ms" Oct 9 02:46:35.616552 kubelet[2518]: I1009 02:46:35.616514 2518 kubelet_node_status.go:73] "Attempting to register node" node="ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.616912 kubelet[2518]: E1009 02:46:35.616879 2518 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.48.63:6443/api/v1/nodes\": dial tcp 188.245.48.63:6443: connect: connection refused" node="ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618106 kubelet[2518]: I1009 02:46:35.618084 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c057fdfa450ff52d1c7dd73b45942f05-ca-certs\") pod \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" (UID: \"c057fdfa450ff52d1c7dd73b45942f05\") " pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618188 kubelet[2518]: I1009 02:46:35.618115 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c057fdfa450ff52d1c7dd73b45942f05-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" (UID: \"c057fdfa450ff52d1c7dd73b45942f05\") " 
pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618188 kubelet[2518]: I1009 02:46:35.618133 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-k8s-certs\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618188 kubelet[2518]: I1009 02:46:35.618149 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-kubeconfig\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618188 kubelet[2518]: I1009 02:46:35.618166 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618188 kubelet[2518]: I1009 02:46:35.618181 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c057fdfa450ff52d1c7dd73b45942f05-k8s-certs\") pod \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" (UID: \"c057fdfa450ff52d1c7dd73b45942f05\") " pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618333 kubelet[2518]: I1009 02:46:35.618197 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-ca-certs\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618333 kubelet[2518]: I1009 02:46:35.618213 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-flexvolume-dir\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.618333 kubelet[2518]: I1009 02:46:35.618230 2518 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2551eb4063efef7f21b712c31e04aa59-kubeconfig\") pod \"kube-scheduler-ci-4116-0-0-c-ec98df32e3\" (UID: \"2551eb4063efef7f21b712c31e04aa59\") " pod="kube-system/kube-scheduler-ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:35.752902 containerd[1509]: time="2024-10-09T02:46:35.752815874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4116-0-0-c-ec98df32e3,Uid:c057fdfa450ff52d1c7dd73b45942f05,Namespace:kube-system,Attempt:0,}" Oct 9 02:46:35.760654 containerd[1509]: time="2024-10-09T02:46:35.760594532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4116-0-0-c-ec98df32e3,Uid:429a8cc9f3efc1b4b9bded91c3069686,Namespace:kube-system,Attempt:0,}" Oct 9 02:46:35.766306 containerd[1509]: time="2024-10-09T02:46:35.766234474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4116-0-0-c-ec98df32e3,Uid:2551eb4063efef7f21b712c31e04aa59,Namespace:kube-system,Attempt:0,}" Oct 9 02:46:35.920262 kubelet[2518]: E1009 02:46:35.920128 2518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://188.245.48.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4116-0-0-c-ec98df32e3?timeout=10s\": dial tcp 188.245.48.63:6443: connect: connection refused" interval="800ms" Oct 9 02:46:36.021983 kubelet[2518]: I1009 02:46:36.021587 2518 kubelet_node_status.go:73] "Attempting to register node" node="ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:36.021983 kubelet[2518]: E1009 02:46:36.021901 2518 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.48.63:6443/api/v1/nodes\": dial tcp 188.245.48.63:6443: connect: connection refused" node="ci-4116-0-0-c-ec98df32e3" Oct 9 02:46:36.248668 kubelet[2518]: E1009 02:46:36.248633 2518 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://188.245.48.63:6443/api/v1/namespaces/default/events\": dial tcp 188.245.48.63:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4116-0-0-c-ec98df32e3.17fca8d648d78ccb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4116-0-0-c-ec98df32e3,UID:ci-4116-0-0-c-ec98df32e3,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4116-0-0-c-ec98df32e3,},FirstTimestamp:2024-10-09 02:46:35.301981387 +0000 UTC m=+0.287945690,LastTimestamp:2024-10-09 02:46:35.301981387 +0000 UTC m=+0.287945690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4116-0-0-c-ec98df32e3,}" Oct 9 02:46:36.253095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount159929293.mount: Deactivated successfully. 
Oct 9 02:46:36.263134 containerd[1509]: time="2024-10-09T02:46:36.263072614Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 9 02:46:36.264053 containerd[1509]: time="2024-10-09T02:46:36.264005199Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312076" Oct 9 02:46:36.265293 containerd[1509]: time="2024-10-09T02:46:36.265250663Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 9 02:46:36.266916 containerd[1509]: time="2024-10-09T02:46:36.266788507Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Oct 9 02:46:36.267767 containerd[1509]: time="2024-10-09T02:46:36.267731693Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 9 02:46:36.268458 containerd[1509]: time="2024-10-09T02:46:36.268394470Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 9 02:46:36.269378 containerd[1509]: time="2024-10-09T02:46:36.269336014Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Oct 9 02:46:36.269707 containerd[1509]: time="2024-10-09T02:46:36.269663691Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 9 02:46:36.270674 
containerd[1509]: time="2024-10-09T02:46:36.270402344Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 517.483824ms" Oct 9 02:46:36.273796 containerd[1509]: time="2024-10-09T02:46:36.273750283Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 513.041573ms" Oct 9 02:46:36.276613 containerd[1509]: time="2024-10-09T02:46:36.276567456Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 510.245054ms" Oct 9 02:46:36.333290 kubelet[2518]: W1009 02:46:36.332938 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://188.245.48.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4116-0-0-c-ec98df32e3&limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:36.335234 kubelet[2518]: E1009 02:46:36.334520 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://188.245.48.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4116-0-0-c-ec98df32e3&limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:36.395500 containerd[1509]: 
time="2024-10-09T02:46:36.391380623Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:46:36.395706 containerd[1509]: time="2024-10-09T02:46:36.395420746Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:46:36.395706 containerd[1509]: time="2024-10-09T02:46:36.395538121Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:46:36.397778 containerd[1509]: time="2024-10-09T02:46:36.396323724Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:46:36.400055 containerd[1509]: time="2024-10-09T02:46:36.399836629Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:46:36.400055 containerd[1509]: time="2024-10-09T02:46:36.399889801Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:46:36.400055 containerd[1509]: time="2024-10-09T02:46:36.399903116Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:46:36.400055 containerd[1509]: time="2024-10-09T02:46:36.399978871Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:46:36.406461 containerd[1509]: time="2024-10-09T02:46:36.405073233Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:46:36.406461 containerd[1509]: time="2024-10-09T02:46:36.405123349Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:46:36.406461 containerd[1509]: time="2024-10-09T02:46:36.405136704Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:46:36.407125 containerd[1509]: time="2024-10-09T02:46:36.407002276Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:46:36.428246 systemd[1]: Started cri-containerd-71e35bd83a395861342f75f70606aec37f6e5b126c6f2ffcc920dfe3da078d67.scope - libcontainer container 71e35bd83a395861342f75f70606aec37f6e5b126c6f2ffcc920dfe3da078d67. Oct 9 02:46:36.431926 kubelet[2518]: W1009 02:46:36.431866 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://188.245.48.63:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:36.431926 kubelet[2518]: E1009 02:46:36.431928 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://188.245.48.63:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused Oct 9 02:46:36.439698 systemd[1]: Started cri-containerd-3e75ac517230044829128789e9e8851ed88c435660e96cb3749948cc440c6d1a.scope - libcontainer container 3e75ac517230044829128789e9e8851ed88c435660e96cb3749948cc440c6d1a. Oct 9 02:46:36.443113 systemd[1]: Started cri-containerd-40aa026270c84da41df84c0f48061ca1ffd08b1f3168cf2004ec6687bd7ac72b.scope - libcontainer container 40aa026270c84da41df84c0f48061ca1ffd08b1f3168cf2004ec6687bd7ac72b. 
Oct 9 02:46:36.498354 containerd[1509]: time="2024-10-09T02:46:36.498317532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4116-0-0-c-ec98df32e3,Uid:429a8cc9f3efc1b4b9bded91c3069686,Namespace:kube-system,Attempt:0,} returns sandbox id \"71e35bd83a395861342f75f70606aec37f6e5b126c6f2ffcc920dfe3da078d67\""
Oct 9 02:46:36.505822 containerd[1509]: time="2024-10-09T02:46:36.505703581Z" level=info msg="CreateContainer within sandbox \"71e35bd83a395861342f75f70606aec37f6e5b126c6f2ffcc920dfe3da078d67\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Oct 9 02:46:36.508707 containerd[1509]: time="2024-10-09T02:46:36.508679347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4116-0-0-c-ec98df32e3,Uid:2551eb4063efef7f21b712c31e04aa59,Namespace:kube-system,Attempt:0,} returns sandbox id \"3e75ac517230044829128789e9e8851ed88c435660e96cb3749948cc440c6d1a\""
Oct 9 02:46:36.514398 containerd[1509]: time="2024-10-09T02:46:36.514358438Z" level=info msg="CreateContainer within sandbox \"3e75ac517230044829128789e9e8851ed88c435660e96cb3749948cc440c6d1a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Oct 9 02:46:36.523354 containerd[1509]: time="2024-10-09T02:46:36.523310985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4116-0-0-c-ec98df32e3,Uid:c057fdfa450ff52d1c7dd73b45942f05,Namespace:kube-system,Attempt:0,} returns sandbox id \"40aa026270c84da41df84c0f48061ca1ffd08b1f3168cf2004ec6687bd7ac72b\""
Oct 9 02:46:36.527646 containerd[1509]: time="2024-10-09T02:46:36.527501467Z" level=info msg="CreateContainer within sandbox \"40aa026270c84da41df84c0f48061ca1ffd08b1f3168cf2004ec6687bd7ac72b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Oct 9 02:46:36.535035 containerd[1509]: time="2024-10-09T02:46:36.534989390Z" level=info msg="CreateContainer within sandbox \"3e75ac517230044829128789e9e8851ed88c435660e96cb3749948cc440c6d1a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"776641fcfc935e523561c574a991459d79754c058d234a9b5ba4c0f3157c5b60\""
Oct 9 02:46:36.536580 containerd[1509]: time="2024-10-09T02:46:36.535762790Z" level=info msg="StartContainer for \"776641fcfc935e523561c574a991459d79754c058d234a9b5ba4c0f3157c5b60\""
Oct 9 02:46:36.547804 containerd[1509]: time="2024-10-09T02:46:36.547769093Z" level=info msg="CreateContainer within sandbox \"71e35bd83a395861342f75f70606aec37f6e5b126c6f2ffcc920dfe3da078d67\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8fb783c9798fb617c9b2af70f0e2b6ce9212a49c478728e68f5b83e904eca4ab\""
Oct 9 02:46:36.548471 containerd[1509]: time="2024-10-09T02:46:36.548388439Z" level=info msg="StartContainer for \"8fb783c9798fb617c9b2af70f0e2b6ce9212a49c478728e68f5b83e904eca4ab\""
Oct 9 02:46:36.558395 containerd[1509]: time="2024-10-09T02:46:36.558358413Z" level=info msg="CreateContainer within sandbox \"40aa026270c84da41df84c0f48061ca1ffd08b1f3168cf2004ec6687bd7ac72b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e3af7b68f48189bd56471e62f5fd83de9cb07dcc75dc4c2dd61840778e8db7c5\""
Oct 9 02:46:36.562045 containerd[1509]: time="2024-10-09T02:46:36.562020984Z" level=info msg="StartContainer for \"e3af7b68f48189bd56471e62f5fd83de9cb07dcc75dc4c2dd61840778e8db7c5\""
Oct 9 02:46:36.564585 systemd[1]: Started cri-containerd-776641fcfc935e523561c574a991459d79754c058d234a9b5ba4c0f3157c5b60.scope - libcontainer container 776641fcfc935e523561c574a991459d79754c058d234a9b5ba4c0f3157c5b60.
Oct 9 02:46:36.585611 systemd[1]: Started cri-containerd-8fb783c9798fb617c9b2af70f0e2b6ce9212a49c478728e68f5b83e904eca4ab.scope - libcontainer container 8fb783c9798fb617c9b2af70f0e2b6ce9212a49c478728e68f5b83e904eca4ab.
Oct 9 02:46:36.610559 systemd[1]: Started cri-containerd-e3af7b68f48189bd56471e62f5fd83de9cb07dcc75dc4c2dd61840778e8db7c5.scope - libcontainer container e3af7b68f48189bd56471e62f5fd83de9cb07dcc75dc4c2dd61840778e8db7c5.
Oct 9 02:46:36.658729 kubelet[2518]: W1009 02:46:36.658493 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://188.245.48.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused
Oct 9 02:46:36.658729 kubelet[2518]: E1009 02:46:36.658555 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://188.245.48.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused
Oct 9 02:46:36.671794 containerd[1509]: time="2024-10-09T02:46:36.671540508Z" level=info msg="StartContainer for \"8fb783c9798fb617c9b2af70f0e2b6ce9212a49c478728e68f5b83e904eca4ab\" returns successfully"
Oct 9 02:46:36.675932 containerd[1509]: time="2024-10-09T02:46:36.675647229Z" level=info msg="StartContainer for \"776641fcfc935e523561c574a991459d79754c058d234a9b5ba4c0f3157c5b60\" returns successfully"
Oct 9 02:46:36.682940 containerd[1509]: time="2024-10-09T02:46:36.682880364Z" level=info msg="StartContainer for \"e3af7b68f48189bd56471e62f5fd83de9cb07dcc75dc4c2dd61840778e8db7c5\" returns successfully"
Oct 9 02:46:36.721416 kubelet[2518]: E1009 02:46:36.721369 2518 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.48.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4116-0-0-c-ec98df32e3?timeout=10s\": dial tcp 188.245.48.63:6443: connect: connection refused" interval="1.6s"
Oct 9 02:46:36.826403 kubelet[2518]: I1009 02:46:36.826289 2518 kubelet_node_status.go:73] "Attempting to register node" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:36.826659 kubelet[2518]: E1009 02:46:36.826619 2518 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://188.245.48.63:6443/api/v1/nodes\": dial tcp 188.245.48.63:6443: connect: connection refused" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:36.835333 kubelet[2518]: W1009 02:46:36.835265 2518 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://188.245.48.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused
Oct 9 02:46:36.835509 kubelet[2518]: E1009 02:46:36.835344 2518 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://188.245.48.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.48.63:6443: connect: connection refused
Oct 9 02:46:38.325409 kubelet[2518]: E1009 02:46:38.325338 2518 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4116-0-0-c-ec98df32e3\" not found" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:38.429517 kubelet[2518]: I1009 02:46:38.429391 2518 kubelet_node_status.go:73] "Attempting to register node" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:38.437183 kubelet[2518]: I1009 02:46:38.437119 2518 kubelet_node_status.go:76] "Successfully registered node" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:39.295496 kubelet[2518]: I1009 02:46:39.295454 2518 apiserver.go:52] "Watching apiserver"
Oct 9 02:46:39.317515 kubelet[2518]: I1009 02:46:39.317452 2518 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Oct 9 02:46:41.111086 systemd[1]: Reloading requested from client PID 2786 ('systemctl') (unit session-7.scope)...
Oct 9 02:46:41.111103 systemd[1]: Reloading...
Oct 9 02:46:41.207519 zram_generator::config[2829]: No configuration found.
Oct 9 02:46:41.321413 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Oct 9 02:46:41.399192 systemd[1]: Reloading finished in 287 ms.
Oct 9 02:46:41.441589 kubelet[2518]: I1009 02:46:41.441328 2518 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Oct 9 02:46:41.441531 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:46:41.448968 systemd[1]: kubelet.service: Deactivated successfully.
Oct 9 02:46:41.449195 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:46:41.455881 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Oct 9 02:46:41.588702 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Oct 9 02:46:41.598843 (kubelet)[2877]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Oct 9 02:46:41.666839 kubelet[2877]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 9 02:46:41.666839 kubelet[2877]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 9 02:46:41.666839 kubelet[2877]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 9 02:46:41.667774 kubelet[2877]: I1009 02:46:41.667105 2877 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 9 02:46:41.673870 kubelet[2877]: I1009 02:46:41.673849 2877 server.go:487] "Kubelet version" kubeletVersion="v1.29.2"
Oct 9 02:46:41.673957 kubelet[2877]: I1009 02:46:41.673946 2877 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 9 02:46:41.674314 kubelet[2877]: I1009 02:46:41.674269 2877 server.go:919] "Client rotation is on, will bootstrap in background"
Oct 9 02:46:41.675835 kubelet[2877]: I1009 02:46:41.675822 2877 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 9 02:46:41.681642 kubelet[2877]: I1009 02:46:41.681621 2877 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Oct 9 02:46:41.689550 kubelet[2877]: I1009 02:46:41.689533 2877 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Oct 9 02:46:41.689887 kubelet[2877]: I1009 02:46:41.689875 2877 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 9 02:46:41.690111 kubelet[2877]: I1009 02:46:41.690092 2877 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Oct 9 02:46:41.691489 kubelet[2877]: I1009 02:46:41.691475 2877 topology_manager.go:138] "Creating topology manager with none policy"
Oct 9 02:46:41.691628 kubelet[2877]: I1009 02:46:41.691550 2877 container_manager_linux.go:301] "Creating device plugin manager"
Oct 9 02:46:41.692956 kubelet[2877]: I1009 02:46:41.692860 2877 state_mem.go:36] "Initialized new in-memory state store"
Oct 9 02:46:41.693041 kubelet[2877]: I1009 02:46:41.693030 2877 kubelet.go:396] "Attempting to sync node with API server"
Oct 9 02:46:41.693571 kubelet[2877]: I1009 02:46:41.693529 2877 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 9 02:46:41.693841 kubelet[2877]: I1009 02:46:41.693742 2877 kubelet.go:312] "Adding apiserver pod source"
Oct 9 02:46:41.693841 kubelet[2877]: I1009 02:46:41.693763 2877 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 9 02:46:41.694770 kubelet[2877]: I1009 02:46:41.694714 2877 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.22" apiVersion="v1"
Oct 9 02:46:41.694936 kubelet[2877]: I1009 02:46:41.694873 2877 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 9 02:46:41.695567 kubelet[2877]: I1009 02:46:41.695234 2877 server.go:1256] "Started kubelet"
Oct 9 02:46:41.705945 kubelet[2877]: I1009 02:46:41.705026 2877 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 9 02:46:41.716675 kubelet[2877]: E1009 02:46:41.716562 2877 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Oct 9 02:46:41.718921 kubelet[2877]: I1009 02:46:41.718609 2877 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Oct 9 02:46:41.719597 kubelet[2877]: I1009 02:46:41.718803 2877 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 9 02:46:41.719597 kubelet[2877]: I1009 02:46:41.719524 2877 volume_manager.go:291] "Starting Kubelet Volume Manager"
Oct 9 02:46:41.722197 kubelet[2877]: I1009 02:46:41.721785 2877 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Oct 9 02:46:41.722197 kubelet[2877]: I1009 02:46:41.721928 2877 reconciler_new.go:29] "Reconciler: start to sync state"
Oct 9 02:46:41.725869 kubelet[2877]: I1009 02:46:41.725238 2877 server.go:461] "Adding debug handlers to kubelet server"
Oct 9 02:46:41.728249 kubelet[2877]: I1009 02:46:41.728224 2877 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 9 02:46:41.729504 kubelet[2877]: I1009 02:46:41.729351 2877 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 9 02:46:41.729504 kubelet[2877]: I1009 02:46:41.729385 2877 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 9 02:46:41.729504 kubelet[2877]: I1009 02:46:41.729403 2877 kubelet.go:2329] "Starting kubelet main sync loop"
Oct 9 02:46:41.729504 kubelet[2877]: E1009 02:46:41.729493 2877 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 9 02:46:41.734127 kubelet[2877]: I1009 02:46:41.733521 2877 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 9 02:46:41.737835 kubelet[2877]: I1009 02:46:41.737819 2877 factory.go:221] Registration of the systemd container factory successfully
Oct 9 02:46:41.738841 kubelet[2877]: I1009 02:46:41.738116 2877 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Oct 9 02:46:41.745046 kubelet[2877]: I1009 02:46:41.743101 2877 factory.go:221] Registration of the containerd container factory successfully
Oct 9 02:46:41.802582 kubelet[2877]: I1009 02:46:41.802538 2877 cpu_manager.go:214] "Starting CPU manager" policy="none"
Oct 9 02:46:41.802582 kubelet[2877]: I1009 02:46:41.802564 2877 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Oct 9 02:46:41.802582 kubelet[2877]: I1009 02:46:41.802580 2877 state_mem.go:36] "Initialized new in-memory state store"
Oct 9 02:46:41.802912 kubelet[2877]: I1009 02:46:41.802722 2877 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Oct 9 02:46:41.802912 kubelet[2877]: I1009 02:46:41.802764 2877 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Oct 9 02:46:41.802912 kubelet[2877]: I1009 02:46:41.802774 2877 policy_none.go:49] "None policy: Start"
Oct 9 02:46:41.803997 kubelet[2877]: I1009 02:46:41.803515 2877 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 9 02:46:41.803997 kubelet[2877]: I1009 02:46:41.803538 2877 state_mem.go:35] "Initializing new in-memory state store"
Oct 9 02:46:41.803997 kubelet[2877]: I1009 02:46:41.803700 2877 state_mem.go:75] "Updated machine memory state"
Oct 9 02:46:41.811990 kubelet[2877]: I1009 02:46:41.811958 2877 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 9 02:46:41.814868 kubelet[2877]: I1009 02:46:41.814054 2877 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 9 02:46:41.824816 kubelet[2877]: I1009 02:46:41.824606 2877 kubelet_node_status.go:73] "Attempting to register node" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.830978 kubelet[2877]: I1009 02:46:41.830515 2877 topology_manager.go:215] "Topology Admit Handler" podUID="429a8cc9f3efc1b4b9bded91c3069686" podNamespace="kube-system" podName="kube-controller-manager-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.830978 kubelet[2877]: I1009 02:46:41.830588 2877 topology_manager.go:215] "Topology Admit Handler" podUID="2551eb4063efef7f21b712c31e04aa59" podNamespace="kube-system" podName="kube-scheduler-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.830978 kubelet[2877]: I1009 02:46:41.830618 2877 topology_manager.go:215] "Topology Admit Handler" podUID="c057fdfa450ff52d1c7dd73b45942f05" podNamespace="kube-system" podName="kube-apiserver-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.846415 kubelet[2877]: E1009 02:46:41.845510 2877 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" already exists" pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.852931 kubelet[2877]: I1009 02:46:41.852898 2877 kubelet_node_status.go:112] "Node was previously registered" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.853041 kubelet[2877]: I1009 02:46:41.852974 2877 kubelet_node_status.go:76] "Successfully registered node" node="ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.924191 kubelet[2877]: I1009 02:46:41.924037 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-ca-certs\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.924191 kubelet[2877]: I1009 02:46:41.924178 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-flexvolume-dir\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.924385 kubelet[2877]: I1009 02:46:41.924203 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-k8s-certs\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.925995 kubelet[2877]: I1009 02:46:41.925348 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-kubeconfig\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.925995 kubelet[2877]: I1009 02:46:41.925483 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/429a8cc9f3efc1b4b9bded91c3069686-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4116-0-0-c-ec98df32e3\" (UID: \"429a8cc9f3efc1b4b9bded91c3069686\") " pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.925995 kubelet[2877]: I1009 02:46:41.925523 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2551eb4063efef7f21b712c31e04aa59-kubeconfig\") pod \"kube-scheduler-ci-4116-0-0-c-ec98df32e3\" (UID: \"2551eb4063efef7f21b712c31e04aa59\") " pod="kube-system/kube-scheduler-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.925995 kubelet[2877]: I1009 02:46:41.925557 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c057fdfa450ff52d1c7dd73b45942f05-ca-certs\") pod \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" (UID: \"c057fdfa450ff52d1c7dd73b45942f05\") " pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.925995 kubelet[2877]: I1009 02:46:41.925586 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c057fdfa450ff52d1c7dd73b45942f05-k8s-certs\") pod \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" (UID: \"c057fdfa450ff52d1c7dd73b45942f05\") " pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:41.926225 kubelet[2877]: I1009 02:46:41.925614 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c057fdfa450ff52d1c7dd73b45942f05-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4116-0-0-c-ec98df32e3\" (UID: \"c057fdfa450ff52d1c7dd73b45942f05\") " pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3"
Oct 9 02:46:42.099516 update_engine[1497]: I20241009 02:46:42.099440 1497 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Oct 9 02:46:42.099931 update_engine[1497]: I20241009 02:46:42.099668 1497 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Oct 9 02:46:42.099931 update_engine[1497]: I20241009 02:46:42.099845 1497 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Oct 9 02:46:42.100648 update_engine[1497]: E20241009 02:46:42.100554 1497 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Oct 9 02:46:42.100648 update_engine[1497]: I20241009 02:46:42.100625 1497 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Oct 9 02:46:42.708600 kubelet[2877]: I1009 02:46:42.708562 2877 apiserver.go:52] "Watching apiserver"
Oct 9 02:46:42.722261 kubelet[2877]: I1009 02:46:42.722196 2877 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world"
Oct 9 02:46:42.935255 kubelet[2877]: I1009 02:46:42.935216 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4116-0-0-c-ec98df32e3" podStartSLOduration=3.935175798 podStartE2EDuration="3.935175798s" podCreationTimestamp="2024-10-09 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:46:42.878725576 +0000 UTC m=+1.269976022" watchObservedRunningTime="2024-10-09 02:46:42.935175798 +0000 UTC m=+1.326426243"
Oct 9 02:46:42.935457 kubelet[2877]: I1009 02:46:42.935327 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4116-0-0-c-ec98df32e3" podStartSLOduration=1.935291087 podStartE2EDuration="1.935291087s" podCreationTimestamp="2024-10-09 02:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:46:42.93333836 +0000 UTC m=+1.324588805" watchObservedRunningTime="2024-10-09 02:46:42.935291087 +0000 UTC m=+1.326541532"
Oct 9 02:46:46.276321 sudo[1919]: pam_unix(sudo:session): session closed for user root
Oct 9 02:46:46.439122 sshd[1916]: pam_unix(sshd:session): session closed for user core
Oct 9 02:46:46.442974 systemd[1]: sshd@6-188.245.48.63:22-139.178.68.195:46006.service: Deactivated successfully.
Oct 9 02:46:46.445126 systemd[1]: session-7.scope: Deactivated successfully.
Oct 9 02:46:46.445495 systemd[1]: session-7.scope: Consumed 3.946s CPU time, 182.8M memory peak, 0B memory swap peak.
Oct 9 02:46:46.447858 systemd-logind[1493]: Session 7 logged out. Waiting for processes to exit.
Oct 9 02:46:46.449269 systemd-logind[1493]: Removed session 7.
Oct 9 02:46:46.580161 kubelet[2877]: I1009 02:46:46.580017 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4116-0-0-c-ec98df32e3" podStartSLOduration=5.579948543 podStartE2EDuration="5.579948543s" podCreationTimestamp="2024-10-09 02:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:46:42.95856922 +0000 UTC m=+1.349819665" watchObservedRunningTime="2024-10-09 02:46:46.579948543 +0000 UTC m=+4.971198998"
Oct 9 02:46:52.104535 update_engine[1497]: I20241009 02:46:52.104013 1497 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Oct 9 02:46:52.104535 update_engine[1497]: I20241009 02:46:52.104321 1497 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Oct 9 02:46:52.105468 update_engine[1497]: I20241009 02:46:52.105374 1497 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Oct 9 02:46:52.106667 update_engine[1497]: E20241009 02:46:52.105897 1497 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.105950 1497 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.105962 1497 omaha_request_action.cc:617] Omaha request response:
Oct 9 02:46:52.106667 update_engine[1497]: E20241009 02:46:52.106045 1497 omaha_request_action.cc:636] Omaha request network transfer failed.
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106073 1497 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106081 1497 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106089 1497 update_attempter.cc:306] Processing Done.
Oct 9 02:46:52.106667 update_engine[1497]: E20241009 02:46:52.106106 1497 update_attempter.cc:619] Update failed.
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106114 1497 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106122 1497 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106129 1497 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106211 1497 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106235 1497 omaha_request_action.cc:271] Posting an Omaha request to disabled
Oct 9 02:46:52.106667 update_engine[1497]: I20241009 02:46:52.106243 1497 omaha_request_action.cc:272] Request:
Oct 9 02:46:52.106667 update_engine[1497]:
Oct 9 02:46:52.106667 update_engine[1497]:
Oct 9 02:46:52.106667 update_engine[1497]:
Oct 9 02:46:52.107116 update_engine[1497]:
Oct 9 02:46:52.107116 update_engine[1497]:
Oct 9 02:46:52.107116 update_engine[1497]:
Oct 9 02:46:52.107116 update_engine[1497]: I20241009 02:46:52.106251 1497 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Oct 9 02:46:52.107116 update_engine[1497]: I20241009 02:46:52.106467 1497 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Oct 9 02:46:52.107116 update_engine[1497]: I20241009 02:46:52.106614 1497 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Oct 9 02:46:52.107263 locksmithd[1532]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Oct 9 02:46:52.107848 update_engine[1497]: E20241009 02:46:52.107806 1497 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107858 1497 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107869 1497 omaha_request_action.cc:617] Omaha request response:
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107878 1497 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107884 1497 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107892 1497 update_attempter.cc:306] Processing Done.
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107899 1497 update_attempter.cc:310] Error event sent.
Oct 9 02:46:52.107912 update_engine[1497]: I20241009 02:46:52.107909 1497 update_check_scheduler.cc:74] Next update check in 44m54s
Oct 9 02:46:52.108191 locksmithd[1532]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Oct 9 02:46:54.272758 kubelet[2877]: I1009 02:46:54.272700 2877 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Oct 9 02:46:54.273165 containerd[1509]: time="2024-10-09T02:46:54.273009282Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Oct 9 02:46:54.273387 kubelet[2877]: I1009 02:46:54.273304 2877 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Oct 9 02:46:54.919948 kubelet[2877]: I1009 02:46:54.919868 2877 topology_manager.go:215] "Topology Admit Handler" podUID="a806e5db-8369-4d2c-887e-4d1c11e263dd" podNamespace="kube-system" podName="kube-proxy-sjrs8"
Oct 9 02:46:54.938471 systemd[1]: Created slice kubepods-besteffort-poda806e5db_8369_4d2c_887e_4d1c11e263dd.slice - libcontainer container kubepods-besteffort-poda806e5db_8369_4d2c_887e_4d1c11e263dd.slice.
Oct 9 02:46:55.020077 kubelet[2877]: I1009 02:46:55.019719 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a806e5db-8369-4d2c-887e-4d1c11e263dd-kube-proxy\") pod \"kube-proxy-sjrs8\" (UID: \"a806e5db-8369-4d2c-887e-4d1c11e263dd\") " pod="kube-system/kube-proxy-sjrs8"
Oct 9 02:46:55.020077 kubelet[2877]: I1009 02:46:55.019780 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4qx\" (UniqueName: \"kubernetes.io/projected/a806e5db-8369-4d2c-887e-4d1c11e263dd-kube-api-access-tk4qx\") pod \"kube-proxy-sjrs8\" (UID: \"a806e5db-8369-4d2c-887e-4d1c11e263dd\") " pod="kube-system/kube-proxy-sjrs8"
Oct 9 02:46:55.020077 kubelet[2877]: I1009 02:46:55.019808 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a806e5db-8369-4d2c-887e-4d1c11e263dd-lib-modules\") pod \"kube-proxy-sjrs8\" (UID: \"a806e5db-8369-4d2c-887e-4d1c11e263dd\") " pod="kube-system/kube-proxy-sjrs8"
Oct 9 02:46:55.020077 kubelet[2877]: I1009 02:46:55.019843 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a806e5db-8369-4d2c-887e-4d1c11e263dd-xtables-lock\") pod \"kube-proxy-sjrs8\" (UID: \"a806e5db-8369-4d2c-887e-4d1c11e263dd\") " pod="kube-system/kube-proxy-sjrs8"
Oct 9 02:46:55.248920 containerd[1509]: time="2024-10-09T02:46:55.248798099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sjrs8,Uid:a806e5db-8369-4d2c-887e-4d1c11e263dd,Namespace:kube-system,Attempt:0,}"
Oct 9 02:46:55.288450 containerd[1509]: time="2024-10-09T02:46:55.287975396Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Oct 9 02:46:55.288450 containerd[1509]: time="2024-10-09T02:46:55.288123357Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Oct 9 02:46:55.289596 containerd[1509]: time="2024-10-09T02:46:55.289164506Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Oct 9 02:46:55.290144 containerd[1509]: time="2024-10-09T02:46:55.289783271Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Oct 9 02:46:55.320910 kubelet[2877]: I1009 02:46:55.320844 2877 topology_manager.go:215] "Topology Admit Handler" podUID="f861cb6e-642d-48b2-8e12-b925bff0c193" podNamespace="tigera-operator" podName="tigera-operator-5d56685c77-pdg4d"
Oct 9 02:46:55.325422 kubelet[2877]: W1009 02:46:55.325322 2877 reflector.go:539] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4116-0-0-c-ec98df32e3" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4116-0-0-c-ec98df32e3' and this object
Oct 9 02:46:55.325422 kubelet[2877]: E1009 02:46:55.325350 2877 reflector.go:147] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4116-0-0-c-ec98df32e3" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4116-0-0-c-ec98df32e3' and this object
Oct 9 02:46:55.337669 systemd[1]: Started cri-containerd-9db9c97f699c5fcea984105e5d4d1c1268f45e8d26e87ed80aaa81733371b37e.scope - libcontainer container 9db9c97f699c5fcea984105e5d4d1c1268f45e8d26e87ed80aaa81733371b37e.
Oct 9 02:46:55.341752 systemd[1]: Created slice kubepods-besteffort-podf861cb6e_642d_48b2_8e12_b925bff0c193.slice - libcontainer container kubepods-besteffort-podf861cb6e_642d_48b2_8e12_b925bff0c193.slice.
Oct 9 02:46:55.364403 containerd[1509]: time="2024-10-09T02:46:55.364363976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sjrs8,Uid:a806e5db-8369-4d2c-887e-4d1c11e263dd,Namespace:kube-system,Attempt:0,} returns sandbox id \"9db9c97f699c5fcea984105e5d4d1c1268f45e8d26e87ed80aaa81733371b37e\""
Oct 9 02:46:55.371519 containerd[1509]: time="2024-10-09T02:46:55.371372633Z" level=info msg="CreateContainer within sandbox \"9db9c97f699c5fcea984105e5d4d1c1268f45e8d26e87ed80aaa81733371b37e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Oct 9 02:46:55.387734 containerd[1509]: time="2024-10-09T02:46:55.387630594Z" level=info msg="CreateContainer within sandbox \"9db9c97f699c5fcea984105e5d4d1c1268f45e8d26e87ed80aaa81733371b37e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b19b662772477d5fec7e4413abfeaaed4cad6eca0ede6fe98ba91e4486798f6b\""
Oct 9 02:46:55.388516 containerd[1509]: time="2024-10-09T02:46:55.388371191Z" level=info msg="StartContainer for \"b19b662772477d5fec7e4413abfeaaed4cad6eca0ede6fe98ba91e4486798f6b\""
Oct 9 02:46:55.417577 systemd[1]: Started cri-containerd-b19b662772477d5fec7e4413abfeaaed4cad6eca0ede6fe98ba91e4486798f6b.scope - libcontainer container b19b662772477d5fec7e4413abfeaaed4cad6eca0ede6fe98ba91e4486798f6b.
Oct 9 02:46:55.424716 kubelet[2877]: I1009 02:46:55.423771 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f861cb6e-642d-48b2-8e12-b925bff0c193-var-lib-calico\") pod \"tigera-operator-5d56685c77-pdg4d\" (UID: \"f861cb6e-642d-48b2-8e12-b925bff0c193\") " pod="tigera-operator/tigera-operator-5d56685c77-pdg4d"
Oct 9 02:46:55.424716 kubelet[2877]: I1009 02:46:55.424545 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtm2r\" (UniqueName: \"kubernetes.io/projected/f861cb6e-642d-48b2-8e12-b925bff0c193-kube-api-access-rtm2r\") pod \"tigera-operator-5d56685c77-pdg4d\" (UID: \"f861cb6e-642d-48b2-8e12-b925bff0c193\") " pod="tigera-operator/tigera-operator-5d56685c77-pdg4d"
Oct 9 02:46:55.450020 containerd[1509]: time="2024-10-09T02:46:55.449984678Z" level=info msg="StartContainer for \"b19b662772477d5fec7e4413abfeaaed4cad6eca0ede6fe98ba91e4486798f6b\" returns successfully"
Oct 9 02:46:55.645308 containerd[1509]: time="2024-10-09T02:46:55.645191556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5d56685c77-pdg4d,Uid:f861cb6e-642d-48b2-8e12-b925bff0c193,Namespace:tigera-operator,Attempt:0,}"
Oct 9 02:46:55.681752 containerd[1509]: time="2024-10-09T02:46:55.681655399Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Oct 9 02:46:55.681752 containerd[1509]: time="2024-10-09T02:46:55.681712307Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Oct 9 02:46:55.681752 containerd[1509]: time="2024-10-09T02:46:55.681725993Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Oct 9 02:46:55.682185 containerd[1509]: time="2024-10-09T02:46:55.681794353Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Oct 9 02:46:55.703874 systemd[1]: Started cri-containerd-5e953a14ebac0d353fded9fa29b50a587761ca25a156131d8743cc336a839173.scope - libcontainer container 5e953a14ebac0d353fded9fa29b50a587761ca25a156131d8743cc336a839173.
Oct 9 02:46:55.760969 containerd[1509]: time="2024-10-09T02:46:55.760879036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5d56685c77-pdg4d,Uid:f861cb6e-642d-48b2-8e12-b925bff0c193,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5e953a14ebac0d353fded9fa29b50a587761ca25a156131d8743cc336a839173\""
Oct 9 02:46:55.762392 containerd[1509]: time="2024-10-09T02:46:55.762369177Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.34.3\""
Oct 9 02:46:57.104344 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3054617566.mount: Deactivated successfully.
Oct 9 02:46:57.476055 containerd[1509]: time="2024-10-09T02:46:57.475900105Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:46:57.477173 containerd[1509]: time="2024-10-09T02:46:57.477115122Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.34.3: active requests=0, bytes read=22136525"
Oct 9 02:46:57.478103 containerd[1509]: time="2024-10-09T02:46:57.478040400Z" level=info msg="ImageCreate event name:\"sha256:d4e6e064c25d51e66b2470e80d7b57004f79e2a76b37e83986577f8666da9736\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:46:57.480349 containerd[1509]: time="2024-10-09T02:46:57.480325309Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:2cc4de6ad019ccc3abbd2615c159d0dcfb2ecdab90dc5805f08837d7c014d458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:46:57.481134 containerd[1509]: time="2024-10-09T02:46:57.480994851Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.34.3\" with image id \"sha256:d4e6e064c25d51e66b2470e80d7b57004f79e2a76b37e83986577f8666da9736\", repo tag \"quay.io/tigera/operator:v1.34.3\", repo digest \"quay.io/tigera/operator@sha256:2cc4de6ad019ccc3abbd2615c159d0dcfb2ecdab90dc5805f08837d7c014d458\", size \"22130728\" in 1.718596697s"
Oct 9 02:46:57.481134 containerd[1509]: time="2024-10-09T02:46:57.481028344Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.34.3\" returns image reference \"sha256:d4e6e064c25d51e66b2470e80d7b57004f79e2a76b37e83986577f8666da9736\""
Oct 9 02:46:57.484512 containerd[1509]: time="2024-10-09T02:46:57.484394727Z" level=info msg="CreateContainer within sandbox \"5e953a14ebac0d353fded9fa29b50a587761ca25a156131d8743cc336a839173\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Oct 9 02:46:57.501697 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3370303854.mount: Deactivated successfully.
Oct 9 02:46:57.504417 containerd[1509]: time="2024-10-09T02:46:57.504214584Z" level=info msg="CreateContainer within sandbox \"5e953a14ebac0d353fded9fa29b50a587761ca25a156131d8743cc336a839173\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6ac9a23eda122d6470f458c5e7187d7f4bdad9cb6750585c0d12376d9b6bf3fa\""
Oct 9 02:46:57.504798 containerd[1509]: time="2024-10-09T02:46:57.504777633Z" level=info msg="StartContainer for \"6ac9a23eda122d6470f458c5e7187d7f4bdad9cb6750585c0d12376d9b6bf3fa\""
Oct 9 02:46:57.538655 systemd[1]: Started cri-containerd-6ac9a23eda122d6470f458c5e7187d7f4bdad9cb6750585c0d12376d9b6bf3fa.scope - libcontainer container 6ac9a23eda122d6470f458c5e7187d7f4bdad9cb6750585c0d12376d9b6bf3fa.
Oct 9 02:46:57.567190 containerd[1509]: time="2024-10-09T02:46:57.567143805Z" level=info msg="StartContainer for \"6ac9a23eda122d6470f458c5e7187d7f4bdad9cb6750585c0d12376d9b6bf3fa\" returns successfully"
Oct 9 02:46:57.820298 kubelet[2877]: I1009 02:46:57.820254 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-sjrs8" podStartSLOduration=3.818931373 podStartE2EDuration="3.818931373s" podCreationTimestamp="2024-10-09 02:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:46:55.819163828 +0000 UTC m=+14.210414273" watchObservedRunningTime="2024-10-09 02:46:57.818931373 +0000 UTC m=+16.210181828"
Oct 9 02:46:57.821970 kubelet[2877]: I1009 02:46:57.820642 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5d56685c77-pdg4d" podStartSLOduration=1.100897113 podStartE2EDuration="2.820603889s" podCreationTimestamp="2024-10-09 02:46:55 +0000 UTC" firstStartedPulling="2024-10-09 02:46:55.761832678 +0000 UTC m=+14.153083123" lastFinishedPulling="2024-10-09 02:46:57.481539454 +0000 UTC m=+15.872789899" observedRunningTime="2024-10-09 02:46:57.820280033 +0000 UTC m=+16.211530488" watchObservedRunningTime="2024-10-09 02:46:57.820603889 +0000 UTC m=+16.211854334"
Oct 9 02:47:00.423348 kubelet[2877]: I1009 02:47:00.423310 2877 topology_manager.go:215] "Topology Admit Handler" podUID="18b9a564-1cd6-4adc-a356-26372ffdcfff" podNamespace="calico-system" podName="calico-typha-5b59df6465-xh92c"
Oct 9 02:47:00.434306 systemd[1]: Created slice kubepods-besteffort-pod18b9a564_1cd6_4adc_a356_26372ffdcfff.slice - libcontainer container kubepods-besteffort-pod18b9a564_1cd6_4adc_a356_26372ffdcfff.slice.
Oct 9 02:47:00.467018 kubelet[2877]: I1009 02:47:00.466984 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/18b9a564-1cd6-4adc-a356-26372ffdcfff-typha-certs\") pod \"calico-typha-5b59df6465-xh92c\" (UID: \"18b9a564-1cd6-4adc-a356-26372ffdcfff\") " pod="calico-system/calico-typha-5b59df6465-xh92c"
Oct 9 02:47:00.467018 kubelet[2877]: I1009 02:47:00.467029 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbdt\" (UniqueName: \"kubernetes.io/projected/18b9a564-1cd6-4adc-a356-26372ffdcfff-kube-api-access-kkbdt\") pod \"calico-typha-5b59df6465-xh92c\" (UID: \"18b9a564-1cd6-4adc-a356-26372ffdcfff\") " pod="calico-system/calico-typha-5b59df6465-xh92c"
Oct 9 02:47:00.467172 kubelet[2877]: I1009 02:47:00.467057 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18b9a564-1cd6-4adc-a356-26372ffdcfff-tigera-ca-bundle\") pod \"calico-typha-5b59df6465-xh92c\" (UID: \"18b9a564-1cd6-4adc-a356-26372ffdcfff\") " pod="calico-system/calico-typha-5b59df6465-xh92c"
Oct 9 02:47:00.471650 kubelet[2877]: I1009 02:47:00.470735 2877 topology_manager.go:215] "Topology Admit Handler" podUID="ad94d7f1-a5ec-43ac-936a-96d757e957c5" podNamespace="calico-system" podName="calico-node-89x8m"
Oct 9 02:47:00.479008 systemd[1]: Created slice kubepods-besteffort-podad94d7f1_a5ec_43ac_936a_96d757e957c5.slice - libcontainer container kubepods-besteffort-podad94d7f1_a5ec_43ac_936a_96d757e957c5.slice.
Oct 9 02:47:00.568200 kubelet[2877]: I1009 02:47:00.568173 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-net-dir\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.568381 kubelet[2877]: I1009 02:47:00.568369 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-log-dir\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.568494 kubelet[2877]: I1009 02:47:00.568481 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ad94d7f1-a5ec-43ac-936a-96d757e957c5-node-certs\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.569174 kubelet[2877]: I1009 02:47:00.569162 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad94d7f1-a5ec-43ac-936a-96d757e957c5-tigera-ca-bundle\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.569405 kubelet[2877]: I1009 02:47:00.569250 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-run-calico\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.569405 kubelet[2877]: I1009 02:47:00.569344 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-lib-modules\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.569405 kubelet[2877]: I1009 02:47:00.569363 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-xtables-lock\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.570089 kubelet[2877]: I1009 02:47:00.569553 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-policysync\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.570089 kubelet[2877]: I1009 02:47:00.569582 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-lib-calico\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.570089 kubelet[2877]: I1009 02:47:00.569631 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2hq\" (UniqueName: \"kubernetes.io/projected/ad94d7f1-a5ec-43ac-936a-96d757e957c5-kube-api-access-4z2hq\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.570089 kubelet[2877]: I1009 02:47:00.569651 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-bin-dir\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.570089 kubelet[2877]: I1009 02:47:00.569668 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-flexvol-driver-host\") pod \"calico-node-89x8m\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " pod="calico-system/calico-node-89x8m"
Oct 9 02:47:00.596833 kubelet[2877]: I1009 02:47:00.596798 2877 topology_manager.go:215] "Topology Admit Handler" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" podNamespace="calico-system" podName="csi-node-driver-2g2jv"
Oct 9 02:47:00.597070 kubelet[2877]: E1009 02:47:00.597046 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7"
Oct 9 02:47:00.670973 kubelet[2877]: I1009 02:47:00.670596 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0cdabbef-c118-48a3-a126-6564dda90df7-varrun\") pod \"csi-node-driver-2g2jv\" (UID: \"0cdabbef-c118-48a3-a126-6564dda90df7\") " pod="calico-system/csi-node-driver-2g2jv"
Oct 9 02:47:00.671195 kubelet[2877]: I1009 02:47:00.670934 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0cdabbef-c118-48a3-a126-6564dda90df7-registration-dir\") pod \"csi-node-driver-2g2jv\" (UID: \"0cdabbef-c118-48a3-a126-6564dda90df7\") " pod="calico-system/csi-node-driver-2g2jv"
Oct 9 02:47:00.671544 kubelet[2877]: I1009 02:47:00.671421 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmfj\" (UniqueName: \"kubernetes.io/projected/0cdabbef-c118-48a3-a126-6564dda90df7-kube-api-access-9qmfj\") pod \"csi-node-driver-2g2jv\" (UID: \"0cdabbef-c118-48a3-a126-6564dda90df7\") " pod="calico-system/csi-node-driver-2g2jv"
Oct 9 02:47:00.671787 kubelet[2877]: E1009 02:47:00.671774 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.671926 kubelet[2877]: W1009 02:47:00.671851 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.671926 kubelet[2877]: E1009 02:47:00.671881 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.672394 kubelet[2877]: E1009 02:47:00.672351 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.672394 kubelet[2877]: W1009 02:47:00.672361 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.672823 kubelet[2877]: E1009 02:47:00.672718 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.672823 kubelet[2877]: W1009 02:47:00.672729 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.672823 kubelet[2877]: E1009 02:47:00.672740 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.672970 kubelet[2877]: E1009 02:47:00.672937 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.673227 kubelet[2877]: E1009 02:47:00.673195 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.673227 kubelet[2877]: W1009 02:47:00.673206 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.673619 kubelet[2877]: E1009 02:47:00.673492 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.673948 kubelet[2877]: E1009 02:47:00.673937 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.674135 kubelet[2877]: W1009 02:47:00.674003 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.674135 kubelet[2877]: E1009 02:47:00.674019 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.674471 kubelet[2877]: E1009 02:47:00.674409 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.674620 kubelet[2877]: W1009 02:47:00.674545 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.674620 kubelet[2877]: E1009 02:47:00.674568 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.674620 kubelet[2877]: I1009 02:47:00.674586 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0cdabbef-c118-48a3-a126-6564dda90df7-socket-dir\") pod \"csi-node-driver-2g2jv\" (UID: \"0cdabbef-c118-48a3-a126-6564dda90df7\") " pod="calico-system/csi-node-driver-2g2jv"
Oct 9 02:47:00.675120 kubelet[2877]: E1009 02:47:00.674999 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.675120 kubelet[2877]: W1009 02:47:00.675008 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.675225 kubelet[2877]: E1009 02:47:00.675213 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.675492 kubelet[2877]: E1009 02:47:00.675475 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.675611 kubelet[2877]: W1009 02:47:00.675547 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.675790 kubelet[2877]: E1009 02:47:00.675771 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.675981 kubelet[2877]: E1009 02:47:00.675971 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.676068 kubelet[2877]: W1009 02:47:00.676049 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.676194 kubelet[2877]: E1009 02:47:00.676140 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.676539 kubelet[2877]: E1009 02:47:00.676473 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.676539 kubelet[2877]: W1009 02:47:00.676483 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.676765 kubelet[2877]: E1009 02:47:00.676675 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.676915 kubelet[2877]: E1009 02:47:00.676898 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.677024 kubelet[2877]: W1009 02:47:00.676959 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.677097 kubelet[2877]: E1009 02:47:00.677070 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.677472 kubelet[2877]: E1009 02:47:00.677348 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.677472 kubelet[2877]: W1009 02:47:00.677357 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.677576 kubelet[2877]: E1009 02:47:00.677564 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.677727 kubelet[2877]: E1009 02:47:00.677683 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.677727 kubelet[2877]: W1009 02:47:00.677692 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.677855 kubelet[2877]: E1009 02:47:00.677796 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.678128 kubelet[2877]: E1009 02:47:00.678056 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.678128 kubelet[2877]: W1009 02:47:00.678065 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.678224 kubelet[2877]: E1009 02:47:00.678206 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.678484 kubelet[2877]: E1009 02:47:00.678377 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.678484 kubelet[2877]: W1009 02:47:00.678386 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.678598 kubelet[2877]: E1009 02:47:00.678586 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.678717 kubelet[2877]: E1009 02:47:00.678707 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.678788 kubelet[2877]: W1009 02:47:00.678764 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.678905 kubelet[2877]: E1009 02:47:00.678854 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.679090 kubelet[2877]: E1009 02:47:00.679080 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.679174 kubelet[2877]: W1009 02:47:00.679134 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.679277 kubelet[2877]: E1009 02:47:00.679222 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.679409 kubelet[2877]: E1009 02:47:00.679398 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.679631 kubelet[2877]: W1009 02:47:00.679488 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.679631 kubelet[2877]: E1009 02:47:00.679574 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.680026 kubelet[2877]: E1009 02:47:00.679924 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.680026 kubelet[2877]: W1009 02:47:00.679934 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.680125 kubelet[2877]: E1009 02:47:00.680104 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.680205 kubelet[2877]: E1009 02:47:00.680185 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.680205 kubelet[2877]: W1009 02:47:00.680193 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.680379 kubelet[2877]: E1009 02:47:00.680362 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.680558 kubelet[2877]: E1009 02:47:00.680472 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.680558 kubelet[2877]: W1009 02:47:00.680479 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.680664 kubelet[2877]: E1009 02:47:00.680654 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.680853 kubelet[2877]: E1009 02:47:00.680843 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.680929 kubelet[2877]: W1009 02:47:00.680906 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.681065 kubelet[2877]: E1009 02:47:00.681013 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.681252 kubelet[2877]: E1009 02:47:00.681150 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.681252 kubelet[2877]: W1009 02:47:00.681159 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.681252 kubelet[2877]: E1009 02:47:00.681172 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.681375 kubelet[2877]: E1009 02:47:00.681365 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.681547 kubelet[2877]: W1009 02:47:00.681463 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.681547 kubelet[2877]: E1009 02:47:00.681489 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.682023 kubelet[2877]: E1009 02:47:00.681899 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.682023 kubelet[2877]: W1009 02:47:00.681909 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.682023 kubelet[2877]: E1009 02:47:00.681920 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Oct 9 02:47:00.682023 kubelet[2877]: I1009 02:47:00.681935 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0cdabbef-c118-48a3-a126-6564dda90df7-kubelet-dir\") pod \"csi-node-driver-2g2jv\" (UID: \"0cdabbef-c118-48a3-a126-6564dda90df7\") " pod="calico-system/csi-node-driver-2g2jv" Oct 9 02:47:00.682244 kubelet[2877]: E1009 02:47:00.682232 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.682374 kubelet[2877]: W1009 02:47:00.682289 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.682374 kubelet[2877]: E1009 02:47:00.682303 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.682626 kubelet[2877]: E1009 02:47:00.682615 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.682787 kubelet[2877]: W1009 02:47:00.682676 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.682787 kubelet[2877]: E1009 02:47:00.682703 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.682954 kubelet[2877]: E1009 02:47:00.682943 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.683072 kubelet[2877]: W1009 02:47:00.682991 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.683072 kubelet[2877]: E1009 02:47:00.683004 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.683222 kubelet[2877]: E1009 02:47:00.683212 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.683352 kubelet[2877]: W1009 02:47:00.683275 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.683352 kubelet[2877]: E1009 02:47:00.683288 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.683609 kubelet[2877]: E1009 02:47:00.683597 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.683810 kubelet[2877]: W1009 02:47:00.683695 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.683810 kubelet[2877]: E1009 02:47:00.683726 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.684010 kubelet[2877]: E1009 02:47:00.683999 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.684064 kubelet[2877]: W1009 02:47:00.684054 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.684167 kubelet[2877]: E1009 02:47:00.684104 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.684330 kubelet[2877]: E1009 02:47:00.684320 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.684481 kubelet[2877]: W1009 02:47:00.684384 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.684481 kubelet[2877]: E1009 02:47:00.684396 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.684653 kubelet[2877]: E1009 02:47:00.684642 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.684733 kubelet[2877]: W1009 02:47:00.684723 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.684944 kubelet[2877]: E1009 02:47:00.684870 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.685084 kubelet[2877]: E1009 02:47:00.685058 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.685162 kubelet[2877]: W1009 02:47:00.685130 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.685383 kubelet[2877]: E1009 02:47:00.685344 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.685531 kubelet[2877]: E1009 02:47:00.685485 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.685531 kubelet[2877]: W1009 02:47:00.685495 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.685658 kubelet[2877]: E1009 02:47:00.685609 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.685921 kubelet[2877]: E1009 02:47:00.685871 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.685921 kubelet[2877]: W1009 02:47:00.685880 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.686029 kubelet[2877]: E1009 02:47:00.685993 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.689858 kubelet[2877]: E1009 02:47:00.688556 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.689858 kubelet[2877]: W1009 02:47:00.688567 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.689858 kubelet[2877]: E1009 02:47:00.689556 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.689858 kubelet[2877]: W1009 02:47:00.689565 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.689858 kubelet[2877]: E1009 02:47:00.689712 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.689858 kubelet[2877]: W1009 02:47:00.689719 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Oct 9 02:47:00.690027 kubelet[2877]: E1009 02:47:00.689878 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.690027 kubelet[2877]: W1009 02:47:00.689885 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690178 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690187 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690197 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.690464 kubelet[2877]: W1009 02:47:00.690188 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690205 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690237 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690350 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.690464 kubelet[2877]: E1009 02:47:00.690356 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.690464 kubelet[2877]: W1009 02:47:00.690357 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.692243 kubelet[2877]: E1009 02:47:00.690513 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.692243 kubelet[2877]: W1009 02:47:00.690520 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.692243 kubelet[2877]: E1009 02:47:00.691025 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.692243 kubelet[2877]: W1009 02:47:00.691032 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.692243 kubelet[2877]: E1009 02:47:00.691169 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.692243 kubelet[2877]: W1009 02:47:00.691175 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Oct 9 02:47:00.694334 kubelet[2877]: E1009 02:47:00.693697 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.694334 kubelet[2877]: W1009 02:47:00.693711 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.694334 kubelet[2877]: E1009 02:47:00.693722 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.694334 kubelet[2877]: E1009 02:47:00.693782 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.694807 kubelet[2877]: E1009 02:47:00.694725 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.694807 kubelet[2877]: W1009 02:47:00.694736 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.694807 kubelet[2877]: E1009 02:47:00.694746 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.695157 kubelet[2877]: E1009 02:47:00.694954 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.695157 kubelet[2877]: W1009 02:47:00.694966 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.695157 kubelet[2877]: E1009 02:47:00.694977 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.695157 kubelet[2877]: E1009 02:47:00.695130 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.695157 kubelet[2877]: W1009 02:47:00.695137 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.695157 kubelet[2877]: E1009 02:47:00.695147 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.695287 kubelet[2877]: E1009 02:47:00.695279 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.695287 kubelet[2877]: W1009 02:47:00.695287 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.695330 kubelet[2877]: E1009 02:47:00.695296 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.695506 kubelet[2877]: E1009 02:47:00.695367 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.695506 kubelet[2877]: E1009 02:47:00.695392 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.695506 kubelet[2877]: E1009 02:47:00.695406 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.695594 kubelet[2877]: E1009 02:47:00.695541 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.695594 kubelet[2877]: W1009 02:47:00.695548 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.695594 kubelet[2877]: E1009 02:47:00.695572 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.695893 kubelet[2877]: E1009 02:47:00.695745 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.695893 kubelet[2877]: W1009 02:47:00.695755 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.695893 kubelet[2877]: E1009 02:47:00.695765 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.695978 kubelet[2877]: E1009 02:47:00.695919 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.695978 kubelet[2877]: W1009 02:47:00.695926 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.695978 kubelet[2877]: E1009 02:47:00.695935 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.696134 kubelet[2877]: E1009 02:47:00.696072 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.696134 kubelet[2877]: W1009 02:47:00.696083 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.696134 kubelet[2877]: E1009 02:47:00.696130 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696299 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.697571 kubelet[2877]: W1009 02:47:00.696307 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696332 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696526 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.697571 kubelet[2877]: W1009 02:47:00.696532 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696554 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696755 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.697571 kubelet[2877]: W1009 02:47:00.696761 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696774 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.697571 kubelet[2877]: E1009 02:47:00.696918 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.697829 kubelet[2877]: W1009 02:47:00.696924 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.697829 kubelet[2877]: E1009 02:47:00.696933 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.697829 kubelet[2877]: E1009 02:47:00.697064 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.697829 kubelet[2877]: W1009 02:47:00.697070 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.697829 kubelet[2877]: E1009 02:47:00.697079 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.697829 kubelet[2877]: E1009 02:47:00.697232 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.697829 kubelet[2877]: W1009 02:47:00.697239 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.697829 kubelet[2877]: E1009 02:47:00.697248 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.703684 kubelet[2877]: E1009 02:47:00.703629 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.703684 kubelet[2877]: W1009 02:47:00.703640 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.703684 kubelet[2877]: E1009 02:47:00.703654 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.740176 containerd[1509]: time="2024-10-09T02:47:00.740116965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b59df6465-xh92c,Uid:18b9a564-1cd6-4adc-a356-26372ffdcfff,Namespace:calico-system,Attempt:0,}" Oct 9 02:47:00.775752 containerd[1509]: time="2024-10-09T02:47:00.775513390Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:00.775752 containerd[1509]: time="2024-10-09T02:47:00.775604513Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:00.775752 containerd[1509]: time="2024-10-09T02:47:00.775624101Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:00.776218 containerd[1509]: time="2024-10-09T02:47:00.775724842Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:00.784780 kubelet[2877]: E1009 02:47:00.784155 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.784780 kubelet[2877]: W1009 02:47:00.784172 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.784780 kubelet[2877]: E1009 02:47:00.784190 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:00.784780 kubelet[2877]: E1009 02:47:00.784403 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:00.784780 kubelet[2877]: W1009 02:47:00.784411 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:00.784780 kubelet[2877]: E1009 02:47:00.784442 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:00.784780 kubelet[2877]: E1009 02:47:00.784621 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.784780 kubelet[2877]: W1009 02:47:00.784628 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.784780 kubelet[2877]: E1009 02:47:00.784645 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[identical FlexVolume driver-call.go/plugins.go error triplets repeated from 02:47:00.785031 through 02:47:00.799768 omitted]
Oct 9 02:47:00.785738 containerd[1509]: time="2024-10-09T02:47:00.785703386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-89x8m,Uid:ad94d7f1-a5ec-43ac-936a-96d757e957c5,Namespace:calico-system,Attempt:0,}"
Oct 9 02:47:00.801997 systemd[1]: Started cri-containerd-153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87.scope - libcontainer container 153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87.
Oct 9 02:47:00.815770 kubelet[2877]: E1009 02:47:00.815686 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:00.816488 kubelet[2877]: W1009 02:47:00.816000 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:00.816488 kubelet[2877]: E1009 02:47:00.816422 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 9 02:47:00.902510 containerd[1509]: time="2024-10-09T02:47:00.901558413Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Oct 9 02:47:00.902510 containerd[1509]: time="2024-10-09T02:47:00.901616603Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Oct 9 02:47:00.902510 containerd[1509]: time="2024-10-09T02:47:00.901634778Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Oct 9 02:47:00.902510 containerd[1509]: time="2024-10-09T02:47:00.901697266Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Oct 9 02:47:00.913456 containerd[1509]: time="2024-10-09T02:47:00.913389783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b59df6465-xh92c,Uid:18b9a564-1cd6-4adc-a356-26372ffdcfff,Namespace:calico-system,Attempt:0,} returns sandbox id \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\""
Oct 9 02:47:00.915265 containerd[1509]: time="2024-10-09T02:47:00.915238360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.28.1\""
Oct 9 02:47:00.932576 systemd[1]: Started cri-containerd-cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e.scope - libcontainer container cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e.
Oct 9 02:47:00.983706 containerd[1509]: time="2024-10-09T02:47:00.983673247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-89x8m,Uid:ad94d7f1-a5ec-43ac-936a-96d757e957c5,Namespace:calico-system,Attempt:0,} returns sandbox id \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\""
Oct 9 02:47:02.729844 kubelet[2877]: E1009 02:47:02.729782 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7"
Oct 9 02:47:03.269870 containerd[1509]: time="2024-10-09T02:47:03.269807012Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:03.270984 containerd[1509]: time="2024-10-09T02:47:03.270823199Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.28.1: active requests=0, bytes read=29471335"
Oct 9 02:47:03.275647 containerd[1509]: time="2024-10-09T02:47:03.275521820Z" level=info msg="ImageCreate event name:\"sha256:a19ab150adede78dd36481226e260735eb3b811481c6765aec79e8da6ae78b7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:03.278648 containerd[1509]: time="2024-10-09T02:47:03.277728413Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d97114d8e1e5186f1180fc8ef5f1309e0a8bf97efce35e0a0223d057d78d95fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:03.278648 containerd[1509]: time="2024-10-09T02:47:03.278306448Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.28.1\" with image id \"sha256:a19ab150adede78dd36481226e260735eb3b811481c6765aec79e8da6ae78b7f\", repo tag \"ghcr.io/flatcar/calico/typha:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d97114d8e1e5186f1180fc8ef5f1309e0a8bf97efce35e0a0223d057d78d95fb\", size \"30963728\" in 2.363038612s"
Oct 9 02:47:03.278648 containerd[1509]: time="2024-10-09T02:47:03.278330715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.28.1\" returns image reference \"sha256:a19ab150adede78dd36481226e260735eb3b811481c6765aec79e8da6ae78b7f\""
Oct 9 02:47:03.279905 containerd[1509]: time="2024-10-09T02:47:03.279879452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\""
Oct 9 02:47:03.293636 containerd[1509]: time="2024-10-09T02:47:03.293598285Z" level=info msg="CreateContainer within sandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Oct 9 02:47:03.317840 containerd[1509]: time="2024-10-09T02:47:03.317796960Z" level=info msg="CreateContainer within sandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\""
Oct 9 02:47:03.321145 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3441029540.mount: Deactivated successfully.
Oct 9 02:47:03.322577 containerd[1509]: time="2024-10-09T02:47:03.322483998Z" level=info msg="StartContainer for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\""
Oct 9 02:47:03.373600 systemd[1]: Started cri-containerd-f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985.scope - libcontainer container f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985.
Oct 9 02:47:03.413811 containerd[1509]: time="2024-10-09T02:47:03.413676866Z" level=info msg="StartContainer for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" returns successfully"
Oct 9 02:47:03.882075 containerd[1509]: time="2024-10-09T02:47:03.882036352Z" level=info msg="StopContainer for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" with timeout 300 (s)"
Oct 9 02:47:03.894615 containerd[1509]: time="2024-10-09T02:47:03.894314965Z" level=info msg="Stop container \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" with signal terminated"
Oct 9 02:47:03.913264 systemd[1]: cri-containerd-f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985.scope: Deactivated successfully.
Oct 9 02:47:03.977085 containerd[1509]: time="2024-10-09T02:47:03.971311735Z" level=info msg="shim disconnected" id=f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985 namespace=k8s.io
Oct 9 02:47:03.977085 containerd[1509]: time="2024-10-09T02:47:03.977081054Z" level=warning msg="cleaning up after shim disconnected" id=f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985 namespace=k8s.io
Oct 9 02:47:03.977085 containerd[1509]: time="2024-10-09T02:47:03.977093258Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Oct 9 02:47:03.991448 containerd[1509]: time="2024-10-09T02:47:03.991403954Z" level=info msg="StopContainer for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" returns successfully"
Oct 9 02:47:03.991984 containerd[1509]: time="2024-10-09T02:47:03.991967022Z" level=info msg="StopPodSandbox for \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\""
Oct 9 02:47:03.999398 containerd[1509]: time="2024-10-09T02:47:03.994980235Z" level=info msg="Container to stop \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Oct 9 02:47:04.006242 systemd[1]: cri-containerd-153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87.scope: Deactivated successfully.
Oct 9 02:47:04.031716 containerd[1509]: time="2024-10-09T02:47:04.030572592Z" level=info msg="shim disconnected" id=153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87 namespace=k8s.io
Oct 9 02:47:04.031716 containerd[1509]: time="2024-10-09T02:47:04.030617427Z" level=warning msg="cleaning up after shim disconnected" id=153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87 namespace=k8s.io
Oct 9 02:47:04.031716 containerd[1509]: time="2024-10-09T02:47:04.030625773Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Oct 9 02:47:04.045331 containerd[1509]: time="2024-10-09T02:47:04.045295534Z" level=warning msg="cleanup warnings time=\"2024-10-09T02:47:04Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Oct 9 02:47:04.046521 containerd[1509]: time="2024-10-09T02:47:04.046486822Z" level=info msg="TearDown network for sandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" successfully"
Oct 9 02:47:04.046521 containerd[1509]: time="2024-10-09T02:47:04.046509475Z" level=info msg="StopPodSandbox for \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" returns successfully"
Oct 9 02:47:04.065459 kubelet[2877]: I1009 02:47:04.064880 2877 topology_manager.go:215] "Topology Admit Handler" podUID="9ee3d131-9ac5-44ba-87ce-b36bda3cfadc" podNamespace="calico-system" podName="calico-typha-77b7bb8b6c-kd22t"
Oct 9 02:47:04.065459 kubelet[2877]: E1009 02:47:04.064935 2877 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="18b9a564-1cd6-4adc-a356-26372ffdcfff" containerName="calico-typha"
Oct 9 02:47:04.065459 kubelet[2877]: I1009 02:47:04.064957 2877 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b9a564-1cd6-4adc-a356-26372ffdcfff" containerName="calico-typha"
Oct 9 02:47:04.073516 systemd[1]: Created slice kubepods-besteffort-pod9ee3d131_9ac5_44ba_87ce_b36bda3cfadc.slice - libcontainer container kubepods-besteffort-pod9ee3d131_9ac5_44ba_87ce_b36bda3cfadc.slice.
Oct 9 02:47:04.075510 kubelet[2877]: E1009 02:47:04.075486 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 9 02:47:04.075510 kubelet[2877]: W1009 02:47:04.075504 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 9 02:47:04.075654 kubelet[2877]: E1009 02:47:04.075522 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[identical FlexVolume driver-call.go/plugins.go error triplets repeated from 02:47:04.076935 through 02:47:04.107185 omitted]
Oct 9 02:47:04.106520 kubelet[2877]: I1009 02:47:04.106335 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkbdt\" (UniqueName: \"kubernetes.io/projected/18b9a564-1cd6-4adc-a356-26372ffdcfff-kube-api-access-kkbdt\") pod \"18b9a564-1cd6-4adc-a356-26372ffdcfff\" (UID: \"18b9a564-1cd6-4adc-a356-26372ffdcfff\") "
Oct 9 02:47:04.106899 kubelet[2877]: I1009 02:47:04.106680 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18b9a564-1cd6-4adc-a356-26372ffdcfff-tigera-ca-bundle\") pod \"18b9a564-1cd6-4adc-a356-26372ffdcfff\" (UID: \"18b9a564-1cd6-4adc-a356-26372ffdcfff\") "
Oct 9 02:47:04.107342 kubelet[2877]: I1009 02:47:04.106999 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/18b9a564-1cd6-4adc-a356-26372ffdcfff-typha-certs\") pod \"18b9a564-1cd6-4adc-a356-26372ffdcfff\" (UID: \"18b9a564-1cd6-4adc-a356-26372ffdcfff\") "
Error: unexpected end of JSON input" Oct 9 02:47:04.107342 kubelet[2877]: I1009 02:47:04.107204 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4nxm\" (UniqueName: \"kubernetes.io/projected/9ee3d131-9ac5-44ba-87ce-b36bda3cfadc-kube-api-access-q4nxm\") pod \"calico-typha-77b7bb8b6c-kd22t\" (UID: \"9ee3d131-9ac5-44ba-87ce-b36bda3cfadc\") " pod="calico-system/calico-typha-77b7bb8b6c-kd22t" Oct 9 02:47:04.108350 kubelet[2877]: E1009 02:47:04.108053 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.108350 kubelet[2877]: W1009 02:47:04.108062 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.108350 kubelet[2877]: E1009 02:47:04.108072 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.108350 kubelet[2877]: I1009 02:47:04.108089 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ee3d131-9ac5-44ba-87ce-b36bda3cfadc-tigera-ca-bundle\") pod \"calico-typha-77b7bb8b6c-kd22t\" (UID: \"9ee3d131-9ac5-44ba-87ce-b36bda3cfadc\") " pod="calico-system/calico-typha-77b7bb8b6c-kd22t" Oct 9 02:47:04.108517 kubelet[2877]: E1009 02:47:04.108486 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.108517 kubelet[2877]: W1009 02:47:04.108502 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.108517 kubelet[2877]: E1009 02:47:04.108514 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.108691 kubelet[2877]: I1009 02:47:04.108531 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9ee3d131-9ac5-44ba-87ce-b36bda3cfadc-typha-certs\") pod \"calico-typha-77b7bb8b6c-kd22t\" (UID: \"9ee3d131-9ac5-44ba-87ce-b36bda3cfadc\") " pod="calico-system/calico-typha-77b7bb8b6c-kd22t" Oct 9 02:47:04.108835 kubelet[2877]: E1009 02:47:04.108746 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.108835 kubelet[2877]: W1009 02:47:04.108755 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.108835 kubelet[2877]: E1009 02:47:04.108765 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.109260 kubelet[2877]: E1009 02:47:04.108965 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.111463 kubelet[2877]: W1009 02:47:04.108975 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.111463 kubelet[2877]: E1009 02:47:04.110468 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.111463 kubelet[2877]: E1009 02:47:04.110668 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.111463 kubelet[2877]: W1009 02:47:04.110676 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.111463 kubelet[2877]: E1009 02:47:04.110687 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.113534 kubelet[2877]: E1009 02:47:04.113518 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.113534 kubelet[2877]: W1009 02:47:04.113532 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.113729 kubelet[2877]: E1009 02:47:04.113548 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.114562 kubelet[2877]: E1009 02:47:04.114545 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.114562 kubelet[2877]: W1009 02:47:04.114558 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.114978 kubelet[2877]: E1009 02:47:04.114955 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.115682 kubelet[2877]: E1009 02:47:04.115516 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.115682 kubelet[2877]: W1009 02:47:04.115530 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.115989 kubelet[2877]: E1009 02:47:04.115974 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.116145 kubelet[2877]: I1009 02:47:04.116061 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b9a564-1cd6-4adc-a356-26372ffdcfff-kube-api-access-kkbdt" (OuterVolumeSpecName: "kube-api-access-kkbdt") pod "18b9a564-1cd6-4adc-a356-26372ffdcfff" (UID: "18b9a564-1cd6-4adc-a356-26372ffdcfff"). InnerVolumeSpecName "kube-api-access-kkbdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 9 02:47:04.116407 kubelet[2877]: E1009 02:47:04.116353 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.116407 kubelet[2877]: W1009 02:47:04.116363 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.116670 kubelet[2877]: E1009 02:47:04.116633 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.116670 kubelet[2877]: W1009 02:47:04.116645 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.116967 kubelet[2877]: E1009 02:47:04.116708 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.117193 kubelet[2877]: E1009 02:47:04.117077 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.117193 kubelet[2877]: W1009 02:47:04.117087 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.117193 kubelet[2877]: E1009 02:47:04.117098 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.117193 kubelet[2877]: E1009 02:47:04.117122 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.117544 kubelet[2877]: I1009 02:47:04.117498 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b9a564-1cd6-4adc-a356-26372ffdcfff-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "18b9a564-1cd6-4adc-a356-26372ffdcfff" (UID: "18b9a564-1cd6-4adc-a356-26372ffdcfff"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 9 02:47:04.121632 kubelet[2877]: I1009 02:47:04.121594 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b9a564-1cd6-4adc-a356-26372ffdcfff-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "18b9a564-1cd6-4adc-a356-26372ffdcfff" (UID: "18b9a564-1cd6-4adc-a356-26372ffdcfff"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 9 02:47:04.209379 kubelet[2877]: E1009 02:47:04.209274 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.209379 kubelet[2877]: W1009 02:47:04.209297 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.209379 kubelet[2877]: E1009 02:47:04.209329 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.210782 kubelet[2877]: E1009 02:47:04.209577 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.210782 kubelet[2877]: W1009 02:47:04.209585 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.210782 kubelet[2877]: E1009 02:47:04.209597 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.210782 kubelet[2877]: E1009 02:47:04.209752 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.210782 kubelet[2877]: W1009 02:47:04.209759 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.210782 kubelet[2877]: E1009 02:47:04.209770 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.210782 kubelet[2877]: I1009 02:47:04.209807 2877 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-kkbdt\" (UniqueName: \"kubernetes.io/projected/18b9a564-1cd6-4adc-a356-26372ffdcfff-kube-api-access-kkbdt\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:04.210782 kubelet[2877]: I1009 02:47:04.209818 2877 reconciler_common.go:300] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/18b9a564-1cd6-4adc-a356-26372ffdcfff-typha-certs\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:04.210782 kubelet[2877]: I1009 02:47:04.209828 2877 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18b9a564-1cd6-4adc-a356-26372ffdcfff-tigera-ca-bundle\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:04.210782 kubelet[2877]: E1009 02:47:04.209970 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.211005 kubelet[2877]: W1009 02:47:04.209977 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.211005 kubelet[2877]: E1009 02:47:04.209987 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.211005 kubelet[2877]: E1009 02:47:04.210150 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.211005 kubelet[2877]: W1009 02:47:04.210161 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.211005 kubelet[2877]: E1009 02:47:04.210178 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.211005 kubelet[2877]: E1009 02:47:04.210326 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.211005 kubelet[2877]: W1009 02:47:04.210335 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.211005 kubelet[2877]: E1009 02:47:04.210344 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.211005 kubelet[2877]: E1009 02:47:04.210524 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.211005 kubelet[2877]: W1009 02:47:04.210533 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.211219 kubelet[2877]: E1009 02:47:04.210542 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.212960 kubelet[2877]: E1009 02:47:04.211957 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.212960 kubelet[2877]: W1009 02:47:04.211972 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.212960 kubelet[2877]: E1009 02:47:04.211987 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.212960 kubelet[2877]: E1009 02:47:04.212545 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.212960 kubelet[2877]: W1009 02:47:04.212555 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.212960 kubelet[2877]: E1009 02:47:04.212568 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.213539 kubelet[2877]: E1009 02:47:04.213519 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.213539 kubelet[2877]: W1009 02:47:04.213532 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.213616 kubelet[2877]: E1009 02:47:04.213556 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.213900 kubelet[2877]: E1009 02:47:04.213757 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.213900 kubelet[2877]: W1009 02:47:04.213767 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.213900 kubelet[2877]: E1009 02:47:04.213811 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.214177 kubelet[2877]: E1009 02:47:04.214155 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.214177 kubelet[2877]: W1009 02:47:04.214169 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.214558 kubelet[2877]: E1009 02:47:04.214307 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.214558 kubelet[2877]: W1009 02:47:04.214316 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.214558 kubelet[2877]: E1009 02:47:04.214328 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.214558 kubelet[2877]: E1009 02:47:04.214351 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.215140 kubelet[2877]: E1009 02:47:04.214942 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.215140 kubelet[2877]: W1009 02:47:04.214951 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.215140 kubelet[2877]: E1009 02:47:04.214964 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.215232 kubelet[2877]: E1009 02:47:04.215166 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.215232 kubelet[2877]: W1009 02:47:04.215179 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.215232 kubelet[2877]: E1009 02:47:04.215200 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.215581 kubelet[2877]: E1009 02:47:04.215483 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.215581 kubelet[2877]: W1009 02:47:04.215494 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.215581 kubelet[2877]: E1009 02:47:04.215516 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.215898 kubelet[2877]: E1009 02:47:04.215725 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.215898 kubelet[2877]: W1009 02:47:04.215737 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.215898 kubelet[2877]: E1009 02:47:04.215747 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 9 02:47:04.222977 kubelet[2877]: E1009 02:47:04.222955 2877 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 9 02:47:04.222977 kubelet[2877]: W1009 02:47:04.222967 2877 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 9 02:47:04.222977 kubelet[2877]: E1009 02:47:04.222979 2877 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 9 02:47:04.288897 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985-rootfs.mount: Deactivated successfully. Oct 9 02:47:04.289224 systemd[1]: var-lib-kubelet-pods-18b9a564\x2d1cd6\x2d4adc\x2da356\x2d26372ffdcfff-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Oct 9 02:47:04.289529 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87-rootfs.mount: Deactivated successfully. Oct 9 02:47:04.289809 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87-shm.mount: Deactivated successfully. Oct 9 02:47:04.290121 systemd[1]: var-lib-kubelet-pods-18b9a564\x2d1cd6\x2d4adc\x2da356\x2d26372ffdcfff-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkkbdt.mount: Deactivated successfully. Oct 9 02:47:04.290323 systemd[1]: var-lib-kubelet-pods-18b9a564\x2d1cd6\x2d4adc\x2da356\x2d26372ffdcfff-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. 
Oct 9 02:47:04.377975 containerd[1509]: time="2024-10-09T02:47:04.377876131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77b7bb8b6c-kd22t,Uid:9ee3d131-9ac5-44ba-87ce-b36bda3cfadc,Namespace:calico-system,Attempt:0,}" Oct 9 02:47:04.409874 containerd[1509]: time="2024-10-09T02:47:04.409392457Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:04.409874 containerd[1509]: time="2024-10-09T02:47:04.409477930Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:04.409874 containerd[1509]: time="2024-10-09T02:47:04.409491174Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:04.409874 containerd[1509]: time="2024-10-09T02:47:04.409575244Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:04.431572 systemd[1]: Started cri-containerd-0839cf39046e4d0f93d68c19179c65ff11d3d78cc3128d04615fc2f39ad6119d.scope - libcontainer container 0839cf39046e4d0f93d68c19179c65ff11d3d78cc3128d04615fc2f39ad6119d. 
Oct 9 02:47:04.485804 containerd[1509]: time="2024-10-09T02:47:04.485631900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77b7bb8b6c-kd22t,Uid:9ee3d131-9ac5-44ba-87ce-b36bda3cfadc,Namespace:calico-system,Attempt:0,} returns sandbox id \"0839cf39046e4d0f93d68c19179c65ff11d3d78cc3128d04615fc2f39ad6119d\"" Oct 9 02:47:04.499012 containerd[1509]: time="2024-10-09T02:47:04.498967521Z" level=info msg="CreateContainer within sandbox \"0839cf39046e4d0f93d68c19179c65ff11d3d78cc3128d04615fc2f39ad6119d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Oct 9 02:47:04.510518 containerd[1509]: time="2024-10-09T02:47:04.510390026Z" level=info msg="CreateContainer within sandbox \"0839cf39046e4d0f93d68c19179c65ff11d3d78cc3128d04615fc2f39ad6119d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c3a0fa2d08ef91e33840417e33660b360c85e8a790a870aa2be173d46fdec2a9\"" Oct 9 02:47:04.512004 containerd[1509]: time="2024-10-09T02:47:04.511969961Z" level=info msg="StartContainer for \"c3a0fa2d08ef91e33840417e33660b360c85e8a790a870aa2be173d46fdec2a9\"" Oct 9 02:47:04.555608 systemd[1]: Started cri-containerd-c3a0fa2d08ef91e33840417e33660b360c85e8a790a870aa2be173d46fdec2a9.scope - libcontainer container c3a0fa2d08ef91e33840417e33660b360c85e8a790a870aa2be173d46fdec2a9. 
Oct 9 02:47:04.594927 containerd[1509]: time="2024-10-09T02:47:04.594885920Z" level=info msg="StartContainer for \"c3a0fa2d08ef91e33840417e33660b360c85e8a790a870aa2be173d46fdec2a9\" returns successfully" Oct 9 02:47:04.730512 kubelet[2877]: E1009 02:47:04.730257 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:04.777411 containerd[1509]: time="2024-10-09T02:47:04.777190398Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:04.778414 containerd[1509]: time="2024-10-09T02:47:04.778377028Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1: active requests=0, bytes read=5141007" Oct 9 02:47:04.779571 containerd[1509]: time="2024-10-09T02:47:04.779476041Z" level=info msg="ImageCreate event name:\"sha256:00564b1c843430f804fda219f98769c25b538adebc11504477d5ee331fd8f85b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:04.781856 containerd[1509]: time="2024-10-09T02:47:04.781745293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:7938ad0cb2b49a32937962cc40dd826ad5858999c603bdf5fbf2092a4d50cf01\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:04.782639 containerd[1509]: time="2024-10-09T02:47:04.782386128Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\" with image id \"sha256:00564b1c843430f804fda219f98769c25b538adebc11504477d5ee331fd8f85b\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\", repo digest 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:7938ad0cb2b49a32937962cc40dd826ad5858999c603bdf5fbf2092a4d50cf01\", size \"6633368\" in 1.502478523s" Oct 9 02:47:04.782639 containerd[1509]: time="2024-10-09T02:47:04.782410424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.1\" returns image reference \"sha256:00564b1c843430f804fda219f98769c25b538adebc11504477d5ee331fd8f85b\"" Oct 9 02:47:04.785256 containerd[1509]: time="2024-10-09T02:47:04.785229037Z" level=info msg="CreateContainer within sandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 9 02:47:04.804587 containerd[1509]: time="2024-10-09T02:47:04.804517132Z" level=info msg="CreateContainer within sandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd\"" Oct 9 02:47:04.805263 containerd[1509]: time="2024-10-09T02:47:04.805067416Z" level=info msg="StartContainer for \"569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd\"" Oct 9 02:47:04.840695 systemd[1]: Started cri-containerd-569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd.scope - libcontainer container 569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd. Oct 9 02:47:04.883645 containerd[1509]: time="2024-10-09T02:47:04.883526697Z" level=info msg="StartContainer for \"569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd\" returns successfully" Oct 9 02:47:04.889553 kubelet[2877]: I1009 02:47:04.889363 2877 scope.go:117] "RemoveContainer" containerID="f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985" Oct 9 02:47:04.901177 systemd[1]: Removed slice kubepods-besteffort-pod18b9a564_1cd6_4adc_a356_26372ffdcfff.slice - libcontainer container kubepods-besteffort-pod18b9a564_1cd6_4adc_a356_26372ffdcfff.slice. 
Oct 9 02:47:04.914068 systemd[1]: cri-containerd-569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd.scope: Deactivated successfully. Oct 9 02:47:04.928534 containerd[1509]: time="2024-10-09T02:47:04.928482663Z" level=info msg="RemoveContainer for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\"" Oct 9 02:47:04.947248 containerd[1509]: time="2024-10-09T02:47:04.947212681Z" level=info msg="RemoveContainer for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" returns successfully" Oct 9 02:47:04.947949 kubelet[2877]: I1009 02:47:04.947829 2877 scope.go:117] "RemoveContainer" containerID="f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985" Oct 9 02:47:04.948324 containerd[1509]: time="2024-10-09T02:47:04.948020402Z" level=error msg="ContainerStatus for \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\": not found" Oct 9 02:47:04.948696 kubelet[2877]: E1009 02:47:04.948283 2877 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\": not found" containerID="f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985" Oct 9 02:47:04.950238 kubelet[2877]: I1009 02:47:04.949415 2877 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985"} err="failed to get container status \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\": rpc error: code = NotFound desc = an error occurred when try to find container \"f119cb0b676920cff7da61663432c78504497f90978c140840ba2a7eb47f2985\": not found" Oct 9 02:47:04.963373 kubelet[2877]: I1009 
02:47:04.962502 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-77b7bb8b6c-kd22t" podStartSLOduration=4.962468251 podStartE2EDuration="4.962468251s" podCreationTimestamp="2024-10-09 02:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:47:04.95579456 +0000 UTC m=+23.347045005" watchObservedRunningTime="2024-10-09 02:47:04.962468251 +0000 UTC m=+23.353718697" Oct 9 02:47:04.983674 containerd[1509]: time="2024-10-09T02:47:04.983604351Z" level=info msg="shim disconnected" id=569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd namespace=k8s.io Oct 9 02:47:04.983674 containerd[1509]: time="2024-10-09T02:47:04.983667902Z" level=warning msg="cleaning up after shim disconnected" id=569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd namespace=k8s.io Oct 9 02:47:04.983674 containerd[1509]: time="2024-10-09T02:47:04.983676037Z" level=info msg="cleaning up dead shim" namespace=k8s.io Oct 9 02:47:05.733974 kubelet[2877]: I1009 02:47:05.733568 2877 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="18b9a564-1cd6-4adc-a356-26372ffdcfff" path="/var/lib/kubelet/pods/18b9a564-1cd6-4adc-a356-26372ffdcfff/volumes" Oct 9 02:47:05.936126 containerd[1509]: time="2024-10-09T02:47:05.936062847Z" level=info msg="StopPodSandbox for \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\"" Oct 9 02:47:05.936126 containerd[1509]: time="2024-10-09T02:47:05.936102702Z" level=info msg="Container to stop \"569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Oct 9 02:47:05.940769 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e-shm.mount: Deactivated successfully. 
Oct 9 02:47:05.948616 systemd[1]: cri-containerd-cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e.scope: Deactivated successfully. Oct 9 02:47:05.972246 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e-rootfs.mount: Deactivated successfully. Oct 9 02:47:05.979253 containerd[1509]: time="2024-10-09T02:47:05.979078357Z" level=info msg="shim disconnected" id=cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e namespace=k8s.io Oct 9 02:47:05.979253 containerd[1509]: time="2024-10-09T02:47:05.979126608Z" level=warning msg="cleaning up after shim disconnected" id=cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e namespace=k8s.io Oct 9 02:47:05.979253 containerd[1509]: time="2024-10-09T02:47:05.979134394Z" level=info msg="cleaning up dead shim" namespace=k8s.io Oct 9 02:47:05.995082 containerd[1509]: time="2024-10-09T02:47:05.994986148Z" level=info msg="TearDown network for sandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" successfully" Oct 9 02:47:05.995316 containerd[1509]: time="2024-10-09T02:47:05.995160468Z" level=info msg="StopPodSandbox for \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" returns successfully" Oct 9 02:47:06.121808 kubelet[2877]: I1009 02:47:06.121776 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-lib-modules\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.121990 kubelet[2877]: I1009 02:47:06.121880 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-net-dir\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 
02:47:06.121990 kubelet[2877]: I1009 02:47:06.121907 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-flexvol-driver-host\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.121990 kubelet[2877]: I1009 02:47:06.121935 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad94d7f1-a5ec-43ac-936a-96d757e957c5-tigera-ca-bundle\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.121990 kubelet[2877]: I1009 02:47:06.121962 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-run-calico\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.121990 kubelet[2877]: I1009 02:47:06.121991 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z2hq\" (UniqueName: \"kubernetes.io/projected/ad94d7f1-a5ec-43ac-936a-96d757e957c5-kube-api-access-4z2hq\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122175 kubelet[2877]: I1009 02:47:06.122023 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-xtables-lock\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122175 kubelet[2877]: I1009 02:47:06.122051 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: 
\"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-bin-dir\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122175 kubelet[2877]: I1009 02:47:06.122076 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-log-dir\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122175 kubelet[2877]: I1009 02:47:06.122108 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ad94d7f1-a5ec-43ac-936a-96d757e957c5-node-certs\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122175 kubelet[2877]: I1009 02:47:06.122134 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-policysync\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122175 kubelet[2877]: I1009 02:47:06.122167 2877 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-lib-calico\") pod \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\" (UID: \"ad94d7f1-a5ec-43ac-936a-96d757e957c5\") " Oct 9 02:47:06.122961 kubelet[2877]: I1009 02:47:06.122229 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "var-lib-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.122961 kubelet[2877]: I1009 02:47:06.121845 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.122961 kubelet[2877]: I1009 02:47:06.122276 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.122961 kubelet[2877]: I1009 02:47:06.122292 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.122961 kubelet[2877]: I1009 02:47:06.122375 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "cni-bin-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.123136 kubelet[2877]: I1009 02:47:06.122402 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.123136 kubelet[2877]: I1009 02:47:06.122832 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad94d7f1-a5ec-43ac-936a-96d757e957c5-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 9 02:47:06.123136 kubelet[2877]: I1009 02:47:06.122865 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.125356 kubelet[2877]: I1009 02:47:06.124449 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "xtables-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.125356 kubelet[2877]: I1009 02:47:06.124492 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-policysync" (OuterVolumeSpecName: "policysync") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 9 02:47:06.127011 kubelet[2877]: I1009 02:47:06.126882 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94d7f1-a5ec-43ac-936a-96d757e957c5-node-certs" (OuterVolumeSpecName: "node-certs") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 9 02:47:06.129397 systemd[1]: var-lib-kubelet-pods-ad94d7f1\x2da5ec\x2d43ac\x2d936a\x2d96d757e957c5-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Oct 9 02:47:06.129812 kubelet[2877]: I1009 02:47:06.129721 2877 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad94d7f1-a5ec-43ac-936a-96d757e957c5-kube-api-access-4z2hq" (OuterVolumeSpecName: "kube-api-access-4z2hq") pod "ad94d7f1-a5ec-43ac-936a-96d757e957c5" (UID: "ad94d7f1-a5ec-43ac-936a-96d757e957c5"). InnerVolumeSpecName "kube-api-access-4z2hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 9 02:47:06.133096 systemd[1]: var-lib-kubelet-pods-ad94d7f1\x2da5ec\x2d43ac\x2d936a\x2d96d757e957c5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4z2hq.mount: Deactivated successfully. 
Oct 9 02:47:06.223379 kubelet[2877]: I1009 02:47:06.223343 2877 reconciler_common.go:300] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-policysync\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223379 kubelet[2877]: I1009 02:47:06.223376 2877 reconciler_common.go:300] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-lib-calico\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223393 2877 reconciler_common.go:300] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ad94d7f1-a5ec-43ac-936a-96d757e957c5-node-certs\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223408 2877 reconciler_common.go:300] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-net-dir\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223423 2877 reconciler_common.go:300] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-flexvol-driver-host\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223453 2877 reconciler_common.go:300] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-lib-modules\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223468 2877 reconciler_common.go:300] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-var-run-calico\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 
kubelet[2877]: I1009 02:47:06.223483 2877 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-4z2hq\" (UniqueName: \"kubernetes.io/projected/ad94d7f1-a5ec-43ac-936a-96d757e957c5-kube-api-access-4z2hq\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223496 2877 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad94d7f1-a5ec-43ac-936a-96d757e957c5-tigera-ca-bundle\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223616 kubelet[2877]: I1009 02:47:06.223509 2877 reconciler_common.go:300] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-xtables-lock\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223830 kubelet[2877]: I1009 02:47:06.223619 2877 reconciler_common.go:300] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-bin-dir\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.223830 kubelet[2877]: I1009 02:47:06.223638 2877 reconciler_common.go:300] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ad94d7f1-a5ec-43ac-936a-96d757e957c5-cni-log-dir\") on node \"ci-4116-0-0-c-ec98df32e3\" DevicePath \"\"" Oct 9 02:47:06.730142 kubelet[2877]: E1009 02:47:06.730008 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:06.937797 kubelet[2877]: I1009 02:47:06.937526 2877 scope.go:117] "RemoveContainer" containerID="569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd" Oct 9 02:47:06.940486 
containerd[1509]: time="2024-10-09T02:47:06.940425386Z" level=info msg="RemoveContainer for \"569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd\"" Oct 9 02:47:06.944666 systemd[1]: Removed slice kubepods-besteffort-podad94d7f1_a5ec_43ac_936a_96d757e957c5.slice - libcontainer container kubepods-besteffort-podad94d7f1_a5ec_43ac_936a_96d757e957c5.slice. Oct 9 02:47:06.945218 containerd[1509]: time="2024-10-09T02:47:06.944840052Z" level=info msg="RemoveContainer for \"569ac40bb23d26a7a5cf78a950573020f57d1ca4e90d399b0dd166c6e36fd6dd\" returns successfully" Oct 9 02:47:06.986547 kubelet[2877]: I1009 02:47:06.984922 2877 topology_manager.go:215] "Topology Admit Handler" podUID="619fd16f-a9a9-493c-90e2-1df3e95bfd81" podNamespace="calico-system" podName="calico-node-tk2k8" Oct 9 02:47:06.986547 kubelet[2877]: E1009 02:47:06.985272 2877 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ad94d7f1-a5ec-43ac-936a-96d757e957c5" containerName="flexvol-driver" Oct 9 02:47:06.986547 kubelet[2877]: I1009 02:47:06.985300 2877 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad94d7f1-a5ec-43ac-936a-96d757e957c5" containerName="flexvol-driver" Oct 9 02:47:06.995372 systemd[1]: Created slice kubepods-besteffort-pod619fd16f_a9a9_493c_90e2_1df3e95bfd81.slice - libcontainer container kubepods-besteffort-pod619fd16f_a9a9_493c_90e2_1df3e95bfd81.slice. 
Oct 9 02:47:07.029244 kubelet[2877]: I1009 02:47:07.029115 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-lib-modules\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029244 kubelet[2877]: I1009 02:47:07.029195 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-var-run-calico\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029244 kubelet[2877]: I1009 02:47:07.029217 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trrl\" (UniqueName: \"kubernetes.io/projected/619fd16f-a9a9-493c-90e2-1df3e95bfd81-kube-api-access-2trrl\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029244 kubelet[2877]: I1009 02:47:07.029241 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-xtables-lock\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029528 kubelet[2877]: I1009 02:47:07.029269 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-cni-log-dir\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029528 kubelet[2877]: I1009 02:47:07.029288 2877 
reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-flexvol-driver-host\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029528 kubelet[2877]: I1009 02:47:07.029323 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/619fd16f-a9a9-493c-90e2-1df3e95bfd81-node-certs\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029528 kubelet[2877]: I1009 02:47:07.029339 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-var-lib-calico\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.029528 kubelet[2877]: I1009 02:47:07.029359 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-policysync\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.030120 kubelet[2877]: I1009 02:47:07.029373 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-cni-bin-dir\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.030120 kubelet[2877]: I1009 02:47:07.029820 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/619fd16f-a9a9-493c-90e2-1df3e95bfd81-tigera-ca-bundle\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.030120 kubelet[2877]: I1009 02:47:07.029847 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/619fd16f-a9a9-493c-90e2-1df3e95bfd81-cni-net-dir\") pod \"calico-node-tk2k8\" (UID: \"619fd16f-a9a9-493c-90e2-1df3e95bfd81\") " pod="calico-system/calico-node-tk2k8" Oct 9 02:47:07.300862 containerd[1509]: time="2024-10-09T02:47:07.300694626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tk2k8,Uid:619fd16f-a9a9-493c-90e2-1df3e95bfd81,Namespace:calico-system,Attempt:0,}" Oct 9 02:47:07.340158 containerd[1509]: time="2024-10-09T02:47:07.340033246Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:07.340747 containerd[1509]: time="2024-10-09T02:47:07.340121805Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:07.340747 containerd[1509]: time="2024-10-09T02:47:07.340142564Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:07.340747 containerd[1509]: time="2024-10-09T02:47:07.340261239Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:07.376570 systemd[1]: Started cri-containerd-8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6.scope - libcontainer container 8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6. 
Oct 9 02:47:07.400346 containerd[1509]: time="2024-10-09T02:47:07.400275963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tk2k8,Uid:619fd16f-a9a9-493c-90e2-1df3e95bfd81,Namespace:calico-system,Attempt:0,} returns sandbox id \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\"" Oct 9 02:47:07.402754 containerd[1509]: time="2024-10-09T02:47:07.402726615Z" level=info msg="CreateContainer within sandbox \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 9 02:47:07.412602 containerd[1509]: time="2024-10-09T02:47:07.412568413Z" level=info msg="CreateContainer within sandbox \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325\"" Oct 9 02:47:07.413222 containerd[1509]: time="2024-10-09T02:47:07.412878330Z" level=info msg="StartContainer for \"5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325\"" Oct 9 02:47:07.440771 systemd[1]: Started cri-containerd-5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325.scope - libcontainer container 5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325. Oct 9 02:47:07.477057 containerd[1509]: time="2024-10-09T02:47:07.477019901Z" level=info msg="StartContainer for \"5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325\" returns successfully" Oct 9 02:47:07.493553 systemd[1]: cri-containerd-5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325.scope: Deactivated successfully. 
Oct 9 02:47:07.518594 containerd[1509]: time="2024-10-09T02:47:07.518520589Z" level=info msg="shim disconnected" id=5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325 namespace=k8s.io Oct 9 02:47:07.518813 containerd[1509]: time="2024-10-09T02:47:07.518587566Z" level=warning msg="cleaning up after shim disconnected" id=5c564cd6bd8ad6b97f3e976f297b673df54c75679920409e13e380039da60325 namespace=k8s.io Oct 9 02:47:07.518813 containerd[1509]: time="2024-10-09T02:47:07.518613585Z" level=info msg="cleaning up dead shim" namespace=k8s.io Oct 9 02:47:07.732808 kubelet[2877]: I1009 02:47:07.732778 2877 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="ad94d7f1-a5ec-43ac-936a-96d757e957c5" path="/var/lib/kubelet/pods/ad94d7f1-a5ec-43ac-936a-96d757e957c5/volumes" Oct 9 02:47:07.944341 containerd[1509]: time="2024-10-09T02:47:07.944295753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.28.1\"" Oct 9 02:47:08.730291 kubelet[2877]: E1009 02:47:08.730235 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:10.730794 kubelet[2877]: E1009 02:47:10.730743 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:11.556166 kubelet[2877]: I1009 02:47:11.556088 2877 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 9 02:47:12.528585 containerd[1509]: time="2024-10-09T02:47:12.528527196Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.28.1\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:12.529784 containerd[1509]: time="2024-10-09T02:47:12.529549731Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.28.1: active requests=0, bytes read=93083736" Oct 9 02:47:12.530643 containerd[1509]: time="2024-10-09T02:47:12.530570744Z" level=info msg="ImageCreate event name:\"sha256:f6d76a1259a8c22fd1c603577ee5bb8109bc40f2b3d0536d39160a027ffe9bab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:12.533078 containerd[1509]: time="2024-10-09T02:47:12.533031102Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:1cf32b2159ec9f938e747b82b9b7c74e26e17eb220e002a6a1bd6b5b1266e1fa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:12.534251 containerd[1509]: time="2024-10-09T02:47:12.533868386Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.28.1\" with image id \"sha256:f6d76a1259a8c22fd1c603577ee5bb8109bc40f2b3d0536d39160a027ffe9bab\", repo tag \"ghcr.io/flatcar/calico/cni:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:1cf32b2159ec9f938e747b82b9b7c74e26e17eb220e002a6a1bd6b5b1266e1fa\", size \"94576137\" in 4.589524092s" Oct 9 02:47:12.534251 containerd[1509]: time="2024-10-09T02:47:12.533903252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.28.1\" returns image reference \"sha256:f6d76a1259a8c22fd1c603577ee5bb8109bc40f2b3d0536d39160a027ffe9bab\"" Oct 9 02:47:12.535788 containerd[1509]: time="2024-10-09T02:47:12.535664587Z" level=info msg="CreateContainer within sandbox \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Oct 9 02:47:12.559721 containerd[1509]: time="2024-10-09T02:47:12.559559718Z" level=info msg="CreateContainer within sandbox \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b\"" Oct 9 02:47:12.561464 containerd[1509]: time="2024-10-09T02:47:12.560717349Z" level=info msg="StartContainer for \"6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b\"" Oct 9 02:47:12.626664 systemd[1]: Started cri-containerd-6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b.scope - libcontainer container 6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b. Oct 9 02:47:12.668788 containerd[1509]: time="2024-10-09T02:47:12.668334545Z" level=info msg="StartContainer for \"6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b\" returns successfully" Oct 9 02:47:12.730740 kubelet[2877]: E1009 02:47:12.730692 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:13.069003 systemd[1]: cri-containerd-6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b.scope: Deactivated successfully. Oct 9 02:47:13.101336 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b-rootfs.mount: Deactivated successfully. 
Oct 9 02:47:13.127301 kubelet[2877]: I1009 02:47:13.127267 2877 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Oct 9 02:47:13.145575 containerd[1509]: time="2024-10-09T02:47:13.145351367Z" level=info msg="shim disconnected" id=6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b namespace=k8s.io Oct 9 02:47:13.145575 containerd[1509]: time="2024-10-09T02:47:13.145553599Z" level=warning msg="cleaning up after shim disconnected" id=6cce4a023fd7ba2a59a88e3585bd27394ad5fc4fd3eb4212c997e8b76e794b9b namespace=k8s.io Oct 9 02:47:13.145575 containerd[1509]: time="2024-10-09T02:47:13.145566743Z" level=info msg="cleaning up dead shim" namespace=k8s.io Oct 9 02:47:13.164421 kubelet[2877]: I1009 02:47:13.164123 2877 topology_manager.go:215] "Topology Admit Handler" podUID="32c2fe78-6ebe-4ea3-8cb3-1becb914ffff" podNamespace="kube-system" podName="coredns-76f75df574-678dx" Oct 9 02:47:13.175179 kubelet[2877]: I1009 02:47:13.174558 2877 topology_manager.go:215] "Topology Admit Handler" podUID="7854dae5-93f3-4cca-b78b-56f3e9616784" podNamespace="kube-system" podName="coredns-76f75df574-glmdc" Oct 9 02:47:13.177619 kubelet[2877]: I1009 02:47:13.177221 2877 topology_manager.go:215] "Topology Admit Handler" podUID="f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e" podNamespace="calico-system" podName="calico-kube-controllers-6d4cd8d49c-l2sjn" Oct 9 02:47:13.184279 systemd[1]: Created slice kubepods-burstable-pod32c2fe78_6ebe_4ea3_8cb3_1becb914ffff.slice - libcontainer container kubepods-burstable-pod32c2fe78_6ebe_4ea3_8cb3_1becb914ffff.slice. Oct 9 02:47:13.199073 systemd[1]: Created slice kubepods-burstable-pod7854dae5_93f3_4cca_b78b_56f3e9616784.slice - libcontainer container kubepods-burstable-pod7854dae5_93f3_4cca_b78b_56f3e9616784.slice. Oct 9 02:47:13.205602 systemd[1]: Created slice kubepods-besteffort-podf5dd1ba5_411d_49c3_9c5f_0b47d4857d7e.slice - libcontainer container kubepods-besteffort-podf5dd1ba5_411d_49c3_9c5f_0b47d4857d7e.slice. 
Oct 9 02:47:13.277699 kubelet[2877]: I1009 02:47:13.277636 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e-tigera-ca-bundle\") pod \"calico-kube-controllers-6d4cd8d49c-l2sjn\" (UID: \"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e\") " pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" Oct 9 02:47:13.277903 kubelet[2877]: I1009 02:47:13.277749 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7854dae5-93f3-4cca-b78b-56f3e9616784-config-volume\") pod \"coredns-76f75df574-glmdc\" (UID: \"7854dae5-93f3-4cca-b78b-56f3e9616784\") " pod="kube-system/coredns-76f75df574-glmdc" Oct 9 02:47:13.277903 kubelet[2877]: I1009 02:47:13.277779 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32c2fe78-6ebe-4ea3-8cb3-1becb914ffff-config-volume\") pod \"coredns-76f75df574-678dx\" (UID: \"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff\") " pod="kube-system/coredns-76f75df574-678dx" Oct 9 02:47:13.277903 kubelet[2877]: I1009 02:47:13.277801 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnx27\" (UniqueName: \"kubernetes.io/projected/7854dae5-93f3-4cca-b78b-56f3e9616784-kube-api-access-pnx27\") pod \"coredns-76f75df574-glmdc\" (UID: \"7854dae5-93f3-4cca-b78b-56f3e9616784\") " pod="kube-system/coredns-76f75df574-glmdc" Oct 9 02:47:13.277903 kubelet[2877]: I1009 02:47:13.277828 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjsg5\" (UniqueName: \"kubernetes.io/projected/f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e-kube-api-access-xjsg5\") pod \"calico-kube-controllers-6d4cd8d49c-l2sjn\" (UID: 
\"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e\") " pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" Oct 9 02:47:13.277903 kubelet[2877]: I1009 02:47:13.277848 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwwpx\" (UniqueName: \"kubernetes.io/projected/32c2fe78-6ebe-4ea3-8cb3-1becb914ffff-kube-api-access-gwwpx\") pod \"coredns-76f75df574-678dx\" (UID: \"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff\") " pod="kube-system/coredns-76f75df574-678dx" Oct 9 02:47:13.490672 containerd[1509]: time="2024-10-09T02:47:13.490618208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-678dx,Uid:32c2fe78-6ebe-4ea3-8cb3-1becb914ffff,Namespace:kube-system,Attempt:0,}" Oct 9 02:47:13.504246 containerd[1509]: time="2024-10-09T02:47:13.504205973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-glmdc,Uid:7854dae5-93f3-4cca-b78b-56f3e9616784,Namespace:kube-system,Attempt:0,}" Oct 9 02:47:13.510830 containerd[1509]: time="2024-10-09T02:47:13.510774375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d4cd8d49c-l2sjn,Uid:f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e,Namespace:calico-system,Attempt:0,}" Oct 9 02:47:13.688463 containerd[1509]: time="2024-10-09T02:47:13.688373744Z" level=error msg="Failed to destroy network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.690973 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817-shm.mount: Deactivated successfully. 
Oct 9 02:47:13.692724 containerd[1509]: time="2024-10-09T02:47:13.692597678Z" level=error msg="encountered an error cleaning up failed sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.692801 containerd[1509]: time="2024-10-09T02:47:13.692680073Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d4cd8d49c-l2sjn,Uid:f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.693075 kubelet[2877]: E1009 02:47:13.693046 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.693339 kubelet[2877]: E1009 02:47:13.693320 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" Oct 9 02:47:13.693398 kubelet[2877]: 
E1009 02:47:13.693349 2877 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" Oct 9 02:47:13.693954 kubelet[2877]: E1009 02:47:13.693403 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6d4cd8d49c-l2sjn_calico-system(f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6d4cd8d49c-l2sjn_calico-system(f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" podUID="f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e" Oct 9 02:47:13.697549 containerd[1509]: time="2024-10-09T02:47:13.697497582Z" level=error msg="Failed to destroy network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.697650 containerd[1509]: time="2024-10-09T02:47:13.697622528Z" level=error msg="Failed to destroy network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.697973 containerd[1509]: time="2024-10-09T02:47:13.697925181Z" level=error msg="encountered an error cleaning up failed sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.697973 containerd[1509]: time="2024-10-09T02:47:13.697965888Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-678dx,Uid:32c2fe78-6ebe-4ea3-8cb3-1becb914ffff,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.700847 containerd[1509]: time="2024-10-09T02:47:13.700695173Z" level=error msg="encountered an error cleaning up failed sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.700847 containerd[1509]: time="2024-10-09T02:47:13.700730029Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-glmdc,Uid:7854dae5-93f3-4cca-b78b-56f3e9616784,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.700288 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934-shm.mount: Deactivated successfully. Oct 9 02:47:13.700978 kubelet[2877]: E1009 02:47:13.698114 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.700978 kubelet[2877]: E1009 02:47:13.698147 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-678dx" Oct 9 02:47:13.700978 kubelet[2877]: E1009 02:47:13.698166 2877 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-678dx" Oct 9 02:47:13.700392 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f-shm.mount: Deactivated successfully. 
Oct 9 02:47:13.701553 kubelet[2877]: E1009 02:47:13.698205 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-678dx_kube-system(32c2fe78-6ebe-4ea3-8cb3-1becb914ffff)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-678dx_kube-system(32c2fe78-6ebe-4ea3-8cb3-1becb914ffff)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-678dx" podUID="32c2fe78-6ebe-4ea3-8cb3-1becb914ffff" Oct 9 02:47:13.701553 kubelet[2877]: E1009 02:47:13.701121 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:13.701553 kubelet[2877]: E1009 02:47:13.701149 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-glmdc" Oct 9 02:47:13.702016 kubelet[2877]: E1009 02:47:13.701164 2877 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-glmdc" Oct 9 02:47:13.702016 kubelet[2877]: E1009 02:47:13.701200 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-glmdc_kube-system(7854dae5-93f3-4cca-b78b-56f3e9616784)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-glmdc_kube-system(7854dae5-93f3-4cca-b78b-56f3e9616784)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-glmdc" podUID="7854dae5-93f3-4cca-b78b-56f3e9616784" Oct 9 02:47:13.958088 kubelet[2877]: I1009 02:47:13.957975 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:13.960966 containerd[1509]: time="2024-10-09T02:47:13.959281345Z" level=info msg="StopPodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\"" Oct 9 02:47:13.960966 containerd[1509]: time="2024-10-09T02:47:13.960085328Z" level=info msg="Ensure that sandbox 2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f in task-service has been cleanup successfully" Oct 9 02:47:13.962000 kubelet[2877]: I1009 02:47:13.961719 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:13.963899 containerd[1509]: 
time="2024-10-09T02:47:13.963374282Z" level=info msg="StopPodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\"" Oct 9 02:47:13.963899 containerd[1509]: time="2024-10-09T02:47:13.963575061Z" level=info msg="Ensure that sandbox 7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934 in task-service has been cleanup successfully" Oct 9 02:47:13.976582 containerd[1509]: time="2024-10-09T02:47:13.976541381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.28.1\"" Oct 9 02:47:13.978243 kubelet[2877]: I1009 02:47:13.978215 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:13.980952 containerd[1509]: time="2024-10-09T02:47:13.980506696Z" level=info msg="StopPodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\"" Oct 9 02:47:13.980952 containerd[1509]: time="2024-10-09T02:47:13.980718637Z" level=info msg="Ensure that sandbox 0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817 in task-service has been cleanup successfully" Oct 9 02:47:14.026493 containerd[1509]: time="2024-10-09T02:47:14.026392781Z" level=error msg="StopPodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" failed" error="failed to destroy network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.026754 kubelet[2877]: E1009 02:47:14.026733 2877 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:14.026825 kubelet[2877]: E1009 02:47:14.026782 2877 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817"} Oct 9 02:47:14.026825 kubelet[2877]: E1009 02:47:14.026823 2877 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Oct 9 02:47:14.026904 kubelet[2877]: E1009 02:47:14.026851 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" podUID="f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e" Oct 9 02:47:14.034165 containerd[1509]: time="2024-10-09T02:47:14.034086278Z" level=error msg="StopPodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" failed" error="failed to destroy network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.034370 kubelet[2877]: E1009 02:47:14.034345 2877 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:14.034469 kubelet[2877]: E1009 02:47:14.034389 2877 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f"} Oct 9 02:47:14.034469 kubelet[2877]: E1009 02:47:14.034419 2877 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Oct 9 02:47:14.034469 kubelet[2877]: E1009 02:47:14.034460 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-76f75df574-678dx" podUID="32c2fe78-6ebe-4ea3-8cb3-1becb914ffff" Oct 9 02:47:14.034588 containerd[1509]: time="2024-10-09T02:47:14.034496906Z" level=error msg="StopPodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" failed" error="failed to destroy network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.034619 kubelet[2877]: E1009 02:47:14.034606 2877 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:14.034643 kubelet[2877]: E1009 02:47:14.034622 2877 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934"} Oct 9 02:47:14.034663 kubelet[2877]: E1009 02:47:14.034645 2877 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7854dae5-93f3-4cca-b78b-56f3e9616784\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Oct 9 02:47:14.034698 kubelet[2877]: E1009 02:47:14.034670 2877 pod_workers.go:1298] "Error syncing 
pod, skipping" err="failed to \"KillPodSandbox\" for \"7854dae5-93f3-4cca-b78b-56f3e9616784\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-glmdc" podUID="7854dae5-93f3-4cca-b78b-56f3e9616784" Oct 9 02:47:14.737614 systemd[1]: Created slice kubepods-besteffort-pod0cdabbef_c118_48a3_a126_6564dda90df7.slice - libcontainer container kubepods-besteffort-pod0cdabbef_c118_48a3_a126_6564dda90df7.slice. Oct 9 02:47:14.740260 containerd[1509]: time="2024-10-09T02:47:14.740213640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2g2jv,Uid:0cdabbef-c118-48a3-a126-6564dda90df7,Namespace:calico-system,Attempt:0,}" Oct 9 02:47:14.800610 containerd[1509]: time="2024-10-09T02:47:14.800535977Z" level=error msg="Failed to destroy network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.801002 containerd[1509]: time="2024-10-09T02:47:14.800951232Z" level=error msg="encountered an error cleaning up failed sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.801320 containerd[1509]: time="2024-10-09T02:47:14.801010184Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-2g2jv,Uid:0cdabbef-c118-48a3-a126-6564dda90df7,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.803257 kubelet[2877]: E1009 02:47:14.801227 2877 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:14.803257 kubelet[2877]: E1009 02:47:14.801272 2877 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2g2jv" Oct 9 02:47:14.803257 kubelet[2877]: E1009 02:47:14.801294 2877 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2g2jv" Oct 9 02:47:14.803372 kubelet[2877]: E1009 02:47:14.801342 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-2g2jv_calico-system(0cdabbef-c118-48a3-a126-6564dda90df7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2g2jv_calico-system(0cdabbef-c118-48a3-a126-6564dda90df7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:14.804669 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c-shm.mount: Deactivated successfully. Oct 9 02:47:14.981487 kubelet[2877]: I1009 02:47:14.981456 2877 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:14.982635 containerd[1509]: time="2024-10-09T02:47:14.982026846Z" level=info msg="StopPodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\"" Oct 9 02:47:14.982635 containerd[1509]: time="2024-10-09T02:47:14.982237285Z" level=info msg="Ensure that sandbox c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c in task-service has been cleanup successfully" Oct 9 02:47:15.012004 containerd[1509]: time="2024-10-09T02:47:15.011782118Z" level=error msg="StopPodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" failed" error="failed to destroy network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 9 02:47:15.012425 kubelet[2877]: 
E1009 02:47:15.012374 2877 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:15.012629 kubelet[2877]: E1009 02:47:15.012467 2877 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c"} Oct 9 02:47:15.012629 kubelet[2877]: E1009 02:47:15.012514 2877 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0cdabbef-c118-48a3-a126-6564dda90df7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Oct 9 02:47:15.012629 kubelet[2877]: E1009 02:47:15.012549 2877 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0cdabbef-c118-48a3-a126-6564dda90df7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2g2jv" podUID="0cdabbef-c118-48a3-a126-6564dda90df7" Oct 9 02:47:20.188994 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3964639985.mount: Deactivated successfully. Oct 9 02:47:20.289404 containerd[1509]: time="2024-10-09T02:47:20.274197433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.28.1: active requests=0, bytes read=117873564" Oct 9 02:47:20.304217 containerd[1509]: time="2024-10-09T02:47:20.304160462Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:20.307226 containerd[1509]: time="2024-10-09T02:47:20.307186894Z" level=info msg="ImageCreate event name:\"sha256:8bbeb9e1ee3287b8f750c10383f53fa1ec6f942aaea2a900f666d5e4e63cf4cc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:20.326098 containerd[1509]: time="2024-10-09T02:47:20.326035439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:47908d8b3046dadd6fbea273ac5b0b9bb803cc7b58b9114c50bf7591767d2744\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:20.328308 containerd[1509]: time="2024-10-09T02:47:20.328274242Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.28.1\" with image id \"sha256:8bbeb9e1ee3287b8f750c10383f53fa1ec6f942aaea2a900f666d5e4e63cf4cc\", repo tag \"ghcr.io/flatcar/calico/node:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:47908d8b3046dadd6fbea273ac5b0b9bb803cc7b58b9114c50bf7591767d2744\", size \"117873426\" in 6.349809255s" Oct 9 02:47:20.328308 containerd[1509]: time="2024-10-09T02:47:20.328307625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.28.1\" returns image reference \"sha256:8bbeb9e1ee3287b8f750c10383f53fa1ec6f942aaea2a900f666d5e4e63cf4cc\"" Oct 9 02:47:20.381359 containerd[1509]: time="2024-10-09T02:47:20.381303461Z" level=info msg="CreateContainer within sandbox \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Oct 9 02:47:20.458824 
containerd[1509]: time="2024-10-09T02:47:20.458700041Z" level=info msg="CreateContainer within sandbox \"8654eb3e076e531d0aa7ac853d9153faab99d24a8f12389b5b04943fc28c61a6\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e\"" Oct 9 02:47:20.463553 containerd[1509]: time="2024-10-09T02:47:20.463281512Z" level=info msg="StartContainer for \"b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e\"" Oct 9 02:47:20.547581 systemd[1]: Started cri-containerd-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e.scope - libcontainer container b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e. Oct 9 02:47:20.589158 containerd[1509]: time="2024-10-09T02:47:20.589056448Z" level=info msg="StartContainer for \"b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e\" returns successfully" Oct 9 02:47:20.670018 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Oct 9 02:47:20.670243 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Oct 9 02:47:21.124712 kubelet[2877]: I1009 02:47:21.123992 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-tk2k8" podStartSLOduration=2.706389456 podStartE2EDuration="15.091103537s" podCreationTimestamp="2024-10-09 02:47:06 +0000 UTC" firstStartedPulling="2024-10-09 02:47:07.943879555 +0000 UTC m=+26.335129999" lastFinishedPulling="2024-10-09 02:47:20.328593636 +0000 UTC m=+38.719844080" observedRunningTime="2024-10-09 02:47:21.087467434 +0000 UTC m=+39.478717889" watchObservedRunningTime="2024-10-09 02:47:21.091103537 +0000 UTC m=+39.482353982" Oct 9 02:47:22.344481 kernel: bpftool[4407]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Oct 9 02:47:22.578302 systemd-networkd[1402]: vxlan.calico: Link UP Oct 9 02:47:22.578312 systemd-networkd[1402]: vxlan.calico: Gained carrier Oct 9 02:47:23.704750 systemd-networkd[1402]: vxlan.calico: Gained IPv6LL Oct 9 02:47:26.733860 containerd[1509]: time="2024-10-09T02:47:26.733748645Z" level=info msg="StopPodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\"" Oct 9 02:47:26.735226 containerd[1509]: time="2024-10-09T02:47:26.734130036Z" level=info msg="StopPodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\"" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.808 [INFO][4512] k8s.go 608: Cleaning up netns ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.809 [INFO][4512] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" iface="eth0" netns="/var/run/netns/cni-93825b0c-8017-b37b-da31-8d26ec20450b" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.811 [INFO][4512] dataplane_linux.go 541: Entered netns, deleting veth. 
ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" iface="eth0" netns="/var/run/netns/cni-93825b0c-8017-b37b-da31-8d26ec20450b" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.812 [INFO][4512] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" iface="eth0" netns="/var/run/netns/cni-93825b0c-8017-b37b-da31-8d26ec20450b" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.812 [INFO][4512] k8s.go 615: Releasing IP address(es) ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.812 [INFO][4512] utils.go 188: Calico CNI releasing IP address ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.950 [INFO][4523] ipam_plugin.go 417: Releasing address using handleID ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.951 [INFO][4523] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.952 [INFO][4523] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.958 [WARNING][4523] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.958 [INFO][4523] ipam_plugin.go 445: Releasing address using workloadID ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.960 [INFO][4523] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:26.963982 containerd[1509]: 2024-10-09 02:47:26.961 [INFO][4512] k8s.go 621: Teardown processing complete. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:26.968847 containerd[1509]: time="2024-10-09T02:47:26.968138578Z" level=info msg="TearDown network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" successfully" Oct 9 02:47:26.968847 containerd[1509]: time="2024-10-09T02:47:26.968175368Z" level=info msg="StopPodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" returns successfully" Oct 9 02:47:26.970621 containerd[1509]: time="2024-10-09T02:47:26.970595099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2g2jv,Uid:0cdabbef-c118-48a3-a126-6564dda90df7,Namespace:calico-system,Attempt:1,}" Oct 9 02:47:26.972061 systemd[1]: run-netns-cni\x2d93825b0c\x2d8017\x2db37b\x2dda31\x2d8d26ec20450b.mount: Deactivated successfully. 
Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.807 [INFO][4511] k8s.go 608: Cleaning up netns ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.809 [INFO][4511] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" iface="eth0" netns="/var/run/netns/cni-4904292b-2145-a354-69eb-0ef7469a5f8c" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.810 [INFO][4511] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" iface="eth0" netns="/var/run/netns/cni-4904292b-2145-a354-69eb-0ef7469a5f8c" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.813 [INFO][4511] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" iface="eth0" netns="/var/run/netns/cni-4904292b-2145-a354-69eb-0ef7469a5f8c" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.813 [INFO][4511] k8s.go 615: Releasing IP address(es) ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.813 [INFO][4511] utils.go 188: Calico CNI releasing IP address ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.950 [INFO][4524] ipam_plugin.go 417: Releasing address using handleID ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.951 [INFO][4524] ipam_plugin.go 358: About to acquire host-wide IPAM lock. 
Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.960 [INFO][4524] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.964 [WARNING][4524] ipam_plugin.go 434: Asked to release address but it doesn't exist. Ignoring ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.964 [INFO][4524] ipam_plugin.go 445: Releasing address using workloadID ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.966 [INFO][4524] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:26.975566 containerd[1509]: 2024-10-09 02:47:26.971 [INFO][4511] k8s.go 621: Teardown processing complete. 
ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:26.975566 containerd[1509]: time="2024-10-09T02:47:26.974764597Z" level=info msg="TearDown network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" successfully" Oct 9 02:47:26.975566 containerd[1509]: time="2024-10-09T02:47:26.974779285Z" level=info msg="StopPodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" returns successfully" Oct 9 02:47:26.976386 containerd[1509]: time="2024-10-09T02:47:26.975630162Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-glmdc,Uid:7854dae5-93f3-4cca-b78b-56f3e9616784,Namespace:kube-system,Attempt:1,}" Oct 9 02:47:26.978004 systemd[1]: run-netns-cni\x2d4904292b\x2d2145\x2da354\x2d69eb\x2d0ef7469a5f8c.mount: Deactivated successfully. Oct 9 02:47:27.117923 systemd-networkd[1402]: cali8f315c579b4: Link UP Oct 9 02:47:27.119520 systemd-networkd[1402]: cali8f315c579b4: Gained carrier Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.043 [INFO][4535] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0 csi-node-driver- calico-system 0cdabbef-c118-48a3-a126-6564dda90df7 760 0 2024-10-09 02:47:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78cd84fb8c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s ci-4116-0-0-c-ec98df32e3 csi-node-driver-2g2jv eth0 default [] [] [kns.calico-system ksa.calico-system.default] cali8f315c579b4 [] []}} ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-" Oct 9 
02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.043 [INFO][4535] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.078 [INFO][4559] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" HandleID="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.085 [INFO][4559] ipam_plugin.go 270: Auto assigning IP ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" HandleID="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001fd8c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4116-0-0-c-ec98df32e3", "pod":"csi-node-driver-2g2jv", "timestamp":"2024-10-09 02:47:27.077986458 +0000 UTC"}, Hostname:"ci-4116-0-0-c-ec98df32e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.085 [INFO][4559] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.085 [INFO][4559] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.085 [INFO][4559] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4116-0-0-c-ec98df32e3' Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.088 [INFO][4559] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.094 [INFO][4559] ipam.go 372: Looking up existing affinities for host host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.097 [INFO][4559] ipam.go 489: Trying affinity for 192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.098 [INFO][4559] ipam.go 155: Attempting to load block cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.099 [INFO][4559] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.099 [INFO][4559] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.192/26 handle="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.100 [INFO][4559] ipam.go 1685: Creating new handle: k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3 Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.103 [INFO][4559] ipam.go 1203: Writing block in order to claim IPs block=192.168.96.192/26 handle="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.110 [INFO][4559] ipam.go 1216: Successfully claimed IPs: [192.168.96.193/26] 
block=192.168.96.192/26 handle="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.110 [INFO][4559] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.193/26] handle="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.110 [INFO][4559] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:27.138605 containerd[1509]: 2024-10-09 02:47:27.110 [INFO][4559] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.96.193/26] IPv6=[] ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" HandleID="k8s-pod-network.ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.139993 containerd[1509]: 2024-10-09 02:47:27.112 [INFO][4535] k8s.go 386: Populated endpoint ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0cdabbef-c118-48a3-a126-6564dda90df7", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"", Pod:"csi-node-driver-2g2jv", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.96.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali8f315c579b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:27.139993 containerd[1509]: 2024-10-09 02:47:27.113 [INFO][4535] k8s.go 387: Calico CNI using IPs: [192.168.96.193/32] ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.139993 containerd[1509]: 2024-10-09 02:47:27.113 [INFO][4535] dataplane_linux.go 68: Setting the host side veth name to cali8f315c579b4 ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.139993 containerd[1509]: 2024-10-09 02:47:27.119 [INFO][4535] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.139993 containerd[1509]: 2024-10-09 02:47:27.120 [INFO][4535] k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0cdabbef-c118-48a3-a126-6564dda90df7", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3", Pod:"csi-node-driver-2g2jv", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.96.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali8f315c579b4", MAC:"fe:02:4d:d1:24:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:27.139993 containerd[1509]: 2024-10-09 02:47:27.135 [INFO][4535] k8s.go 500: Wrote updated endpoint to datastore ContainerID="ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3" Namespace="calico-system" 
Pod="csi-node-driver-2g2jv" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:27.160406 systemd-networkd[1402]: calif0bd0cc3018: Link UP Oct 9 02:47:27.160785 systemd-networkd[1402]: calif0bd0cc3018: Gained carrier Oct 9 02:47:27.182769 containerd[1509]: time="2024-10-09T02:47:27.182694081Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:27.182885 containerd[1509]: time="2024-10-09T02:47:27.182811723Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:27.183017 containerd[1509]: time="2024-10-09T02:47:27.182972778Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:27.183188 containerd[1509]: time="2024-10-09T02:47:27.183087634Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.049 [INFO][4541] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0 coredns-76f75df574- kube-system 7854dae5-93f3-4cca-b78b-56f3e9616784 759 0 2024-10-09 02:46:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4116-0-0-c-ec98df32e3 coredns-76f75df574-glmdc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif0bd0cc3018 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.049 [INFO][4541] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.079 [INFO][4560] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" HandleID="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.087 [INFO][4560] ipam_plugin.go 270: Auto assigning IP ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" 
HandleID="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318400), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4116-0-0-c-ec98df32e3", "pod":"coredns-76f75df574-glmdc", "timestamp":"2024-10-09 02:47:27.079743938 +0000 UTC"}, Hostname:"ci-4116-0-0-c-ec98df32e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.088 [INFO][4560] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.110 [INFO][4560] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.110 [INFO][4560] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4116-0-0-c-ec98df32e3' Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.112 [INFO][4560] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.120 [INFO][4560] ipam.go 372: Looking up existing affinities for host host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.123 [INFO][4560] ipam.go 489: Trying affinity for 192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.127 [INFO][4560] ipam.go 155: Attempting to load block cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.130 [INFO][4560] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.192/26 
host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.130 [INFO][4560] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.192/26 handle="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.132 [INFO][4560] ipam.go 1685: Creating new handle: k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491 Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.140 [INFO][4560] ipam.go 1203: Writing block in order to claim IPs block=192.168.96.192/26 handle="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.148 [INFO][4560] ipam.go 1216: Successfully claimed IPs: [192.168.96.194/26] block=192.168.96.192/26 handle="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.148 [INFO][4560] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.194/26] handle="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.148 [INFO][4560] ipam_plugin.go 379: Released host-wide IPAM lock. 
Oct 9 02:47:27.184238 containerd[1509]: 2024-10-09 02:47:27.148 [INFO][4560] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.96.194/26] IPv6=[] ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" HandleID="k8s-pod-network.a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.185614 containerd[1509]: 2024-10-09 02:47:27.154 [INFO][4541] k8s.go 386: Populated endpoint ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"7854dae5-93f3-4cca-b78b-56f3e9616784", ResourceVersion:"759", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"", Pod:"coredns-76f75df574-glmdc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0bd0cc3018", 
MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:27.185614 containerd[1509]: 2024-10-09 02:47:27.155 [INFO][4541] k8s.go 387: Calico CNI using IPs: [192.168.96.194/32] ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.185614 containerd[1509]: 2024-10-09 02:47:27.155 [INFO][4541] dataplane_linux.go 68: Setting the host side veth name to calif0bd0cc3018 ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.185614 containerd[1509]: 2024-10-09 02:47:27.162 [INFO][4541] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.185614 containerd[1509]: 2024-10-09 02:47:27.162 [INFO][4541] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"7854dae5-93f3-4cca-b78b-56f3e9616784", ResourceVersion:"759", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491", Pod:"coredns-76f75df574-glmdc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0bd0cc3018", MAC:"e2:f0:7e:9d:a5:6a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:27.185614 containerd[1509]: 2024-10-09 02:47:27.174 [INFO][4541] k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491" Namespace="kube-system" Pod="coredns-76f75df574-glmdc" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:27.210636 systemd[1]: Started cri-containerd-ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3.scope - libcontainer container ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3. Oct 9 02:47:27.244298 containerd[1509]: time="2024-10-09T02:47:27.242176700Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:27.244298 containerd[1509]: time="2024-10-09T02:47:27.242228738Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:27.244298 containerd[1509]: time="2024-10-09T02:47:27.242267291Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:27.244298 containerd[1509]: time="2024-10-09T02:47:27.242579441Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:27.256628 containerd[1509]: time="2024-10-09T02:47:27.256595869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2g2jv,Uid:0cdabbef-c118-48a3-a126-6564dda90df7,Namespace:calico-system,Attempt:1,} returns sandbox id \"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3\"" Oct 9 02:47:27.258958 containerd[1509]: time="2024-10-09T02:47:27.258938384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.28.1\"" Oct 9 02:47:27.271566 systemd[1]: Started cri-containerd-a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491.scope - libcontainer container a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491. 
Oct 9 02:47:27.307621 containerd[1509]: time="2024-10-09T02:47:27.307570966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-glmdc,Uid:7854dae5-93f3-4cca-b78b-56f3e9616784,Namespace:kube-system,Attempt:1,} returns sandbox id \"a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491\"" Oct 9 02:47:27.311099 containerd[1509]: time="2024-10-09T02:47:27.311001867Z" level=info msg="CreateContainer within sandbox \"a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 9 02:47:27.327993 containerd[1509]: time="2024-10-09T02:47:27.327920228Z" level=info msg="CreateContainer within sandbox \"a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ceba0e9adade4f6f24110eac9691c5db6bdd92624a57e1d26d6052fd77c4dac5\"" Oct 9 02:47:27.328874 containerd[1509]: time="2024-10-09T02:47:27.328275128Z" level=info msg="StartContainer for \"ceba0e9adade4f6f24110eac9691c5db6bdd92624a57e1d26d6052fd77c4dac5\"" Oct 9 02:47:27.351571 systemd[1]: Started cri-containerd-ceba0e9adade4f6f24110eac9691c5db6bdd92624a57e1d26d6052fd77c4dac5.scope - libcontainer container ceba0e9adade4f6f24110eac9691c5db6bdd92624a57e1d26d6052fd77c4dac5. 
Oct 9 02:47:27.378890 containerd[1509]: time="2024-10-09T02:47:27.378762213Z" level=info msg="StartContainer for \"ceba0e9adade4f6f24110eac9691c5db6bdd92624a57e1d26d6052fd77c4dac5\" returns successfully" Oct 9 02:47:27.731578 containerd[1509]: time="2024-10-09T02:47:27.731530868Z" level=info msg="StopPodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\"" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.776 [INFO][4731] k8s.go 608: Cleaning up netns ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.776 [INFO][4731] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" iface="eth0" netns="/var/run/netns/cni-cbcc7146-572f-be98-d123-4e21cefbd5d4" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.777 [INFO][4731] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" iface="eth0" netns="/var/run/netns/cni-cbcc7146-572f-be98-d123-4e21cefbd5d4" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.777 [INFO][4731] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" iface="eth0" netns="/var/run/netns/cni-cbcc7146-572f-be98-d123-4e21cefbd5d4" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.777 [INFO][4731] k8s.go 615: Releasing IP address(es) ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.777 [INFO][4731] utils.go 188: Calico CNI releasing IP address ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.798 [INFO][4737] ipam_plugin.go 417: Releasing address using handleID ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.798 [INFO][4737] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.798 [INFO][4737] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.803 [WARNING][4737] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.803 [INFO][4737] ipam_plugin.go 445: Releasing address using workloadID ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.804 [INFO][4737] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:27.808344 containerd[1509]: 2024-10-09 02:47:27.806 [INFO][4731] k8s.go 621: Teardown processing complete. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:27.809095 containerd[1509]: time="2024-10-09T02:47:27.808655253Z" level=info msg="TearDown network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" successfully" Oct 9 02:47:27.809095 containerd[1509]: time="2024-10-09T02:47:27.808677025Z" level=info msg="StopPodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" returns successfully" Oct 9 02:47:27.809276 containerd[1509]: time="2024-10-09T02:47:27.809253403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d4cd8d49c-l2sjn,Uid:f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e,Namespace:calico-system,Attempt:1,}" Oct 9 02:47:27.915802 systemd-networkd[1402]: calidd72d92c6e5: Link UP Oct 9 02:47:27.916594 systemd-networkd[1402]: calidd72d92c6e5: Gained carrier Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.847 [INFO][4744] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0 calico-kube-controllers-6d4cd8d49c- calico-system f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e 773 0 2024-10-09 02:47:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6d4cd8d49c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4116-0-0-c-ec98df32e3 calico-kube-controllers-6d4cd8d49c-l2sjn eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidd72d92c6e5 [] []}} ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.847 [INFO][4744] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.876 [INFO][4757] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" HandleID="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.883 [INFO][4757] ipam_plugin.go 270: Auto assigning IP ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" HandleID="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" 
Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000285380), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4116-0-0-c-ec98df32e3", "pod":"calico-kube-controllers-6d4cd8d49c-l2sjn", "timestamp":"2024-10-09 02:47:27.876020804 +0000 UTC"}, Hostname:"ci-4116-0-0-c-ec98df32e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.883 [INFO][4757] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.883 [INFO][4757] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.883 [INFO][4757] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4116-0-0-c-ec98df32e3' Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.884 [INFO][4757] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.888 [INFO][4757] ipam.go 372: Looking up existing affinities for host host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.892 [INFO][4757] ipam.go 489: Trying affinity for 192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.893 [INFO][4757] ipam.go 155: Attempting to load block cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.896 [INFO][4757] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 
containerd[1509]: 2024-10-09 02:47:27.896 [INFO][4757] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.192/26 handle="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.899 [INFO][4757] ipam.go 1685: Creating new handle: k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479 Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.904 [INFO][4757] ipam.go 1203: Writing block in order to claim IPs block=192.168.96.192/26 handle="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.910 [INFO][4757] ipam.go 1216: Successfully claimed IPs: [192.168.96.195/26] block=192.168.96.192/26 handle="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.910 [INFO][4757] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.195/26] handle="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.910 [INFO][4757] ipam_plugin.go 379: Released host-wide IPAM lock. 
Oct 9 02:47:27.932242 containerd[1509]: 2024-10-09 02:47:27.910 [INFO][4757] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.96.195/26] IPv6=[] ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" HandleID="k8s-pod-network.a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.938059 containerd[1509]: 2024-10-09 02:47:27.913 [INFO][4744] k8s.go 386: Populated endpoint ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0", GenerateName:"calico-kube-controllers-6d4cd8d49c-", Namespace:"calico-system", SelfLink:"", UID:"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d4cd8d49c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"", Pod:"calico-kube-controllers-6d4cd8d49c-l2sjn", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidd72d92c6e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:27.938059 containerd[1509]: 2024-10-09 02:47:27.913 [INFO][4744] k8s.go 387: Calico CNI using IPs: [192.168.96.195/32] ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.938059 containerd[1509]: 2024-10-09 02:47:27.913 [INFO][4744] dataplane_linux.go 68: Setting the host side veth name to calidd72d92c6e5 ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.938059 containerd[1509]: 2024-10-09 02:47:27.917 [INFO][4744] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.938059 containerd[1509]: 2024-10-09 02:47:27.917 [INFO][4744] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0", GenerateName:"calico-kube-controllers-6d4cd8d49c-", Namespace:"calico-system", SelfLink:"", UID:"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d4cd8d49c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479", Pod:"calico-kube-controllers-6d4cd8d49c-l2sjn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidd72d92c6e5", MAC:"fe:dd:15:15:c8:b7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:27.938059 containerd[1509]: 2024-10-09 02:47:27.924 [INFO][4744] k8s.go 500: Wrote updated endpoint to datastore ContainerID="a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479" Namespace="calico-system" Pod="calico-kube-controllers-6d4cd8d49c-l2sjn" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:27.976568 
systemd[1]: run-netns-cni\x2dcbcc7146\x2d572f\x2dbe98\x2dd123\x2d4e21cefbd5d4.mount: Deactivated successfully. Oct 9 02:47:28.001010 containerd[1509]: time="2024-10-09T02:47:28.000818678Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:28.001010 containerd[1509]: time="2024-10-09T02:47:28.000959634Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:28.001165 containerd[1509]: time="2024-10-09T02:47:28.001021492Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:28.001457 containerd[1509]: time="2024-10-09T02:47:28.001226770Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:28.022163 systemd[1]: run-containerd-runc-k8s.io-a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479-runc.TdzpYJ.mount: Deactivated successfully. Oct 9 02:47:28.028601 systemd[1]: Started cri-containerd-a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479.scope - libcontainer container a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479. 
Oct 9 02:47:28.074251 containerd[1509]: time="2024-10-09T02:47:28.074216049Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d4cd8d49c-l2sjn,Uid:f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e,Namespace:calico-system,Attempt:1,} returns sandbox id \"a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479\"" Oct 9 02:47:28.092849 kubelet[2877]: I1009 02:47:28.092815 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-glmdc" podStartSLOduration=33.092779154 podStartE2EDuration="33.092779154s" podCreationTimestamp="2024-10-09 02:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:47:28.091907226 +0000 UTC m=+46.483157692" watchObservedRunningTime="2024-10-09 02:47:28.092779154 +0000 UTC m=+46.484029599" Oct 9 02:47:28.730804 containerd[1509]: time="2024-10-09T02:47:28.730273268Z" level=info msg="StopPodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\"" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.792 [INFO][4851] k8s.go 608: Cleaning up netns ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.793 [INFO][4851] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" iface="eth0" netns="/var/run/netns/cni-ebba7b5b-d03c-c529-ddd2-0fa64258f20c" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.794 [INFO][4851] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" iface="eth0" netns="/var/run/netns/cni-ebba7b5b-d03c-c529-ddd2-0fa64258f20c" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.794 [INFO][4851] dataplane_linux.go 568: Workload's veth was already gone. 
Nothing to do. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" iface="eth0" netns="/var/run/netns/cni-ebba7b5b-d03c-c529-ddd2-0fa64258f20c" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.795 [INFO][4851] k8s.go 615: Releasing IP address(es) ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.795 [INFO][4851] utils.go 188: Calico CNI releasing IP address ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.832 [INFO][4862] ipam_plugin.go 417: Releasing address using handleID ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.832 [INFO][4862] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.832 [INFO][4862] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.837 [WARNING][4862] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.837 [INFO][4862] ipam_plugin.go 445: Releasing address using workloadID ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.839 [INFO][4862] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:28.843760 containerd[1509]: 2024-10-09 02:47:28.841 [INFO][4851] k8s.go 621: Teardown processing complete. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:28.846882 containerd[1509]: time="2024-10-09T02:47:28.844477530Z" level=info msg="TearDown network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" successfully" Oct 9 02:47:28.846882 containerd[1509]: time="2024-10-09T02:47:28.844503469Z" level=info msg="StopPodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" returns successfully" Oct 9 02:47:28.847474 containerd[1509]: time="2024-10-09T02:47:28.847455766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-678dx,Uid:32c2fe78-6ebe-4ea3-8cb3-1becb914ffff,Namespace:kube-system,Attempt:1,}" Oct 9 02:47:28.848983 systemd[1]: run-netns-cni\x2debba7b5b\x2dd03c\x2dc529\x2dddd2\x2d0fa64258f20c.mount: Deactivated successfully. 
Oct 9 02:47:28.888661 systemd-networkd[1402]: cali8f315c579b4: Gained IPv6LL Oct 9 02:47:28.904117 containerd[1509]: time="2024-10-09T02:47:28.904080534Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:28.905617 containerd[1509]: time="2024-10-09T02:47:28.905584655Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.28.1: active requests=0, bytes read=7642081" Oct 9 02:47:28.906607 containerd[1509]: time="2024-10-09T02:47:28.906560328Z" level=info msg="ImageCreate event name:\"sha256:d0c7782dfd1af19483b1da01b3d6692a92c2a570a3c8c6059128fda84c838a61\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:28.917912 containerd[1509]: time="2024-10-09T02:47:28.916835778Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:01e16d03dd0c29a8e1e302455eb15c2d0326c49cbaca4bbe8dc0e2d5308c5add\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:28.918099 containerd[1509]: time="2024-10-09T02:47:28.917796963Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.28.1\" with image id \"sha256:d0c7782dfd1af19483b1da01b3d6692a92c2a570a3c8c6059128fda84c838a61\", repo tag \"ghcr.io/flatcar/calico/csi:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:01e16d03dd0c29a8e1e302455eb15c2d0326c49cbaca4bbe8dc0e2d5308c5add\", size \"9134482\" in 1.658669933s" Oct 9 02:47:28.918345 containerd[1509]: time="2024-10-09T02:47:28.918228279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.28.1\" returns image reference \"sha256:d0c7782dfd1af19483b1da01b3d6692a92c2a570a3c8c6059128fda84c838a61\"" Oct 9 02:47:28.921462 containerd[1509]: time="2024-10-09T02:47:28.920160157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\"" Oct 9 02:47:28.925492 containerd[1509]: time="2024-10-09T02:47:28.925409483Z" level=info msg="CreateContainer within sandbox 
\"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Oct 9 02:47:28.952710 systemd-networkd[1402]: calif0bd0cc3018: Gained IPv6LL Oct 9 02:47:28.957894 containerd[1509]: time="2024-10-09T02:47:28.957177629Z" level=info msg="CreateContainer within sandbox \"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"40570a24d093595eced8a992683ffbb725d5223204429e7cc1fe4d514eb3a9cd\"" Oct 9 02:47:28.959017 containerd[1509]: time="2024-10-09T02:47:28.958904673Z" level=info msg="StartContainer for \"40570a24d093595eced8a992683ffbb725d5223204429e7cc1fe4d514eb3a9cd\"" Oct 9 02:47:28.985105 systemd-networkd[1402]: cali3259406dbce: Link UP Oct 9 02:47:28.986771 systemd-networkd[1402]: cali3259406dbce: Gained carrier Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.902 [INFO][4870] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0 coredns-76f75df574- kube-system 32c2fe78-6ebe-4ea3-8cb3-1becb914ffff 789 0 2024-10-09 02:46:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4116-0-0-c-ec98df32e3 coredns-76f75df574-678dx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3259406dbce [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.902 [INFO][4870] k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.933 [INFO][4882] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" HandleID="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.942 [INFO][4882] ipam_plugin.go 270: Auto assigning IP ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" HandleID="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ede00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4116-0-0-c-ec98df32e3", "pod":"coredns-76f75df574-678dx", "timestamp":"2024-10-09 02:47:28.933102775 +0000 UTC"}, Hostname:"ci-4116-0-0-c-ec98df32e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.942 [INFO][4882] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.942 [INFO][4882] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.942 [INFO][4882] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4116-0-0-c-ec98df32e3' Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.944 [INFO][4882] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.947 [INFO][4882] ipam.go 372: Looking up existing affinities for host host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.950 [INFO][4882] ipam.go 489: Trying affinity for 192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.951 [INFO][4882] ipam.go 155: Attempting to load block cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.953 [INFO][4882] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.953 [INFO][4882] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.192/26 handle="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.957 [INFO][4882] ipam.go 1685: Creating new handle: k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8 Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.961 [INFO][4882] ipam.go 1203: Writing block in order to claim IPs block=192.168.96.192/26 handle="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.967 [INFO][4882] ipam.go 1216: Successfully claimed IPs: [192.168.96.196/26] 
block=192.168.96.192/26 handle="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.967 [INFO][4882] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.196/26] handle="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.967 [INFO][4882] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:29.020490 containerd[1509]: 2024-10-09 02:47:28.967 [INFO][4882] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.96.196/26] IPv6=[] ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" HandleID="k8s-pod-network.11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.023998 containerd[1509]: 2024-10-09 02:47:28.977 [INFO][4870] k8s.go 386: Populated endpoint ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"", Pod:"coredns-76f75df574-678dx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3259406dbce", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:29.023998 containerd[1509]: 2024-10-09 02:47:28.977 [INFO][4870] k8s.go 387: Calico CNI using IPs: [192.168.96.196/32] ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.023998 containerd[1509]: 2024-10-09 02:47:28.977 [INFO][4870] dataplane_linux.go 68: Setting the host side veth name to cali3259406dbce ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.023998 containerd[1509]: 2024-10-09 02:47:28.986 [INFO][4870] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" 
Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.023998 containerd[1509]: 2024-10-09 02:47:28.986 [INFO][4870] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8", Pod:"coredns-76f75df574-678dx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3259406dbce", MAC:"22:0e:cc:93:23:06", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:29.023998 containerd[1509]: 2024-10-09 02:47:29.005 [INFO][4870] k8s.go 500: Wrote updated endpoint to datastore ContainerID="11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8" Namespace="kube-system" Pod="coredns-76f75df574-678dx" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:29.021278 systemd[1]: run-containerd-runc-k8s.io-40570a24d093595eced8a992683ffbb725d5223204429e7cc1fe4d514eb3a9cd-runc.O9AqKt.mount: Deactivated successfully. Oct 9 02:47:29.034894 systemd[1]: Started cri-containerd-40570a24d093595eced8a992683ffbb725d5223204429e7cc1fe4d514eb3a9cd.scope - libcontainer container 40570a24d093595eced8a992683ffbb725d5223204429e7cc1fe4d514eb3a9cd. Oct 9 02:47:29.062811 containerd[1509]: time="2024-10-09T02:47:29.062566397Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:29.063291 containerd[1509]: time="2024-10-09T02:47:29.063102510Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:29.063291 containerd[1509]: time="2024-10-09T02:47:29.063119011Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:29.063291 containerd[1509]: time="2024-10-09T02:47:29.063183132Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:29.081553 systemd[1]: Started cri-containerd-11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8.scope - libcontainer container 11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8. Oct 9 02:47:29.111905 containerd[1509]: time="2024-10-09T02:47:29.111861902Z" level=info msg="StartContainer for \"40570a24d093595eced8a992683ffbb725d5223204429e7cc1fe4d514eb3a9cd\" returns successfully" Oct 9 02:47:29.133839 containerd[1509]: time="2024-10-09T02:47:29.133779919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-678dx,Uid:32c2fe78-6ebe-4ea3-8cb3-1becb914ffff,Namespace:kube-system,Attempt:1,} returns sandbox id \"11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8\"" Oct 9 02:47:29.136571 containerd[1509]: time="2024-10-09T02:47:29.136542818Z" level=info msg="CreateContainer within sandbox \"11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 9 02:47:29.149240 containerd[1509]: time="2024-10-09T02:47:29.149208539Z" level=info msg="CreateContainer within sandbox \"11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"96ca2b86c9d7d828e57fb0bedcca26b465d54c5581ba7f83774b12e6c20a260e\"" Oct 9 02:47:29.150132 containerd[1509]: time="2024-10-09T02:47:29.150107808Z" level=info msg="StartContainer for \"96ca2b86c9d7d828e57fb0bedcca26b465d54c5581ba7f83774b12e6c20a260e\"" Oct 9 02:47:29.177691 systemd[1]: Started cri-containerd-96ca2b86c9d7d828e57fb0bedcca26b465d54c5581ba7f83774b12e6c20a260e.scope - libcontainer container 96ca2b86c9d7d828e57fb0bedcca26b465d54c5581ba7f83774b12e6c20a260e. 
Oct 9 02:47:29.204787 containerd[1509]: time="2024-10-09T02:47:29.204746274Z" level=info msg="StartContainer for \"96ca2b86c9d7d828e57fb0bedcca26b465d54c5581ba7f83774b12e6c20a260e\" returns successfully" Oct 9 02:47:29.464860 systemd-networkd[1402]: calidd72d92c6e5: Gained IPv6LL Oct 9 02:47:30.107769 kubelet[2877]: I1009 02:47:30.107549 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-678dx" podStartSLOduration=35.107505362 podStartE2EDuration="35.107505362s" podCreationTimestamp="2024-10-09 02:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-10-09 02:47:30.107207108 +0000 UTC m=+48.498457594" watchObservedRunningTime="2024-10-09 02:47:30.107505362 +0000 UTC m=+48.498755827" Oct 9 02:47:31.064664 systemd-networkd[1402]: cali3259406dbce: Gained IPv6LL Oct 9 02:47:31.440405 containerd[1509]: time="2024-10-09T02:47:31.440242967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:31.441922 containerd[1509]: time="2024-10-09T02:47:31.441880549Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.28.1: active requests=0, bytes read=33507125" Oct 9 02:47:31.442867 containerd[1509]: time="2024-10-09T02:47:31.442840071Z" level=info msg="ImageCreate event name:\"sha256:9d19dff735fa0889ad6e741790dd1ff35dc4443f14c95bd61459ff0b9162252e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:31.445015 containerd[1509]: time="2024-10-09T02:47:31.444736241Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:9a7338f7187d4d2352fe49eedee44b191ac92557a2e71aa3de3527ed85c1641b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:31.446098 containerd[1509]: time="2024-10-09T02:47:31.446072064Z" level=info msg="Pulled 
image \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\" with image id \"sha256:9d19dff735fa0889ad6e741790dd1ff35dc4443f14c95bd61459ff0b9162252e\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:9a7338f7187d4d2352fe49eedee44b191ac92557a2e71aa3de3527ed85c1641b\", size \"34999494\" in 2.525887671s" Oct 9 02:47:31.446544 containerd[1509]: time="2024-10-09T02:47:31.446100157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.28.1\" returns image reference \"sha256:9d19dff735fa0889ad6e741790dd1ff35dc4443f14c95bd61459ff0b9162252e\"" Oct 9 02:47:31.447266 containerd[1509]: time="2024-10-09T02:47:31.447124070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\"" Oct 9 02:47:31.466288 containerd[1509]: time="2024-10-09T02:47:31.464639276Z" level=info msg="CreateContainer within sandbox \"a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Oct 9 02:47:31.480406 containerd[1509]: time="2024-10-09T02:47:31.480371653Z" level=info msg="CreateContainer within sandbox \"a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9\"" Oct 9 02:47:31.481867 containerd[1509]: time="2024-10-09T02:47:31.481840207Z" level=info msg="StartContainer for \"a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9\"" Oct 9 02:47:31.519553 systemd[1]: Started cri-containerd-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9.scope - libcontainer container a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9. 
Oct 9 02:47:31.575175 containerd[1509]: time="2024-10-09T02:47:31.574752298Z" level=info msg="StartContainer for \"a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9\" returns successfully" Oct 9 02:47:32.126460 kubelet[2877]: I1009 02:47:32.124138 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6d4cd8d49c-l2sjn" podStartSLOduration=27.753600662 podStartE2EDuration="31.124088091s" podCreationTimestamp="2024-10-09 02:47:01 +0000 UTC" firstStartedPulling="2024-10-09 02:47:28.075964712 +0000 UTC m=+46.467215158" lastFinishedPulling="2024-10-09 02:47:31.446452142 +0000 UTC m=+49.837702587" observedRunningTime="2024-10-09 02:47:32.124010894 +0000 UTC m=+50.515261360" watchObservedRunningTime="2024-10-09 02:47:32.124088091 +0000 UTC m=+50.515338535" Oct 9 02:47:33.308171 containerd[1509]: time="2024-10-09T02:47:33.308082244Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:33.309462 containerd[1509]: time="2024-10-09T02:47:33.309196748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1: active requests=0, bytes read=12907822" Oct 9 02:47:33.310181 containerd[1509]: time="2024-10-09T02:47:33.310138305Z" level=info msg="ImageCreate event name:\"sha256:d1ca8f023879d2e9a9a7c98dbb3252886c5b7676be9529ddb5200aa2789b233e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:33.312637 containerd[1509]: time="2024-10-09T02:47:33.312329703Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:682cc97e4580d25b7314032c008a552bb05182fac34eba82cc389113c7767076\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 9 02:47:33.313458 containerd[1509]: time="2024-10-09T02:47:33.313393782Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\" with 
image id \"sha256:d1ca8f023879d2e9a9a7c98dbb3252886c5b7676be9529ddb5200aa2789b233e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:682cc97e4580d25b7314032c008a552bb05182fac34eba82cc389113c7767076\", size \"14400175\" in 1.866244754s" Oct 9 02:47:33.313520 containerd[1509]: time="2024-10-09T02:47:33.313465958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.1\" returns image reference \"sha256:d1ca8f023879d2e9a9a7c98dbb3252886c5b7676be9529ddb5200aa2789b233e\"" Oct 9 02:47:33.324034 containerd[1509]: time="2024-10-09T02:47:33.323303392Z" level=info msg="CreateContainer within sandbox \"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Oct 9 02:47:33.339081 containerd[1509]: time="2024-10-09T02:47:33.339049330Z" level=info msg="CreateContainer within sandbox \"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"117eb01c01096db393690175490033b6657e988cb3ac9db45f0117ae4f053c1a\"" Oct 9 02:47:33.339421 containerd[1509]: time="2024-10-09T02:47:33.339396476Z" level=info msg="StartContainer for \"117eb01c01096db393690175490033b6657e988cb3ac9db45f0117ae4f053c1a\"" Oct 9 02:47:33.373760 systemd[1]: Started cri-containerd-117eb01c01096db393690175490033b6657e988cb3ac9db45f0117ae4f053c1a.scope - libcontainer container 117eb01c01096db393690175490033b6657e988cb3ac9db45f0117ae4f053c1a. 
Oct 9 02:47:33.411261 containerd[1509]: time="2024-10-09T02:47:33.411180105Z" level=info msg="StartContainer for \"117eb01c01096db393690175490033b6657e988cb3ac9db45f0117ae4f053c1a\" returns successfully" Oct 9 02:47:33.925670 kubelet[2877]: I1009 02:47:33.925625 2877 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Oct 9 02:47:33.928888 kubelet[2877]: I1009 02:47:33.928860 2877 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Oct 9 02:47:41.734535 containerd[1509]: time="2024-10-09T02:47:41.734410173Z" level=info msg="StopPodSandbox for \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\"" Oct 9 02:47:41.734535 containerd[1509]: time="2024-10-09T02:47:41.734527133Z" level=info msg="TearDown network for sandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" successfully" Oct 9 02:47:41.734535 containerd[1509]: time="2024-10-09T02:47:41.734537473Z" level=info msg="StopPodSandbox for \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" returns successfully" Oct 9 02:47:41.736189 containerd[1509]: time="2024-10-09T02:47:41.736137863Z" level=info msg="RemovePodSandbox for \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\"" Oct 9 02:47:41.740021 containerd[1509]: time="2024-10-09T02:47:41.739729500Z" level=info msg="Forcibly stopping sandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\"" Oct 9 02:47:41.740021 containerd[1509]: time="2024-10-09T02:47:41.739810412Z" level=info msg="TearDown network for sandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" successfully" Oct 9 02:47:41.753820 containerd[1509]: time="2024-10-09T02:47:41.753764873Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Oct 9 02:47:41.753820 containerd[1509]: time="2024-10-09T02:47:41.753811441Z" level=info msg="RemovePodSandbox \"153a309771d186a5d26dfae5087b22fff40e32f712df6b355f41ac310852cd87\" returns successfully" Oct 9 02:47:41.754270 containerd[1509]: time="2024-10-09T02:47:41.754241523Z" level=info msg="StopPodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\"" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.792 [WARNING][5152] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0cdabbef-c118-48a3-a126-6564dda90df7", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3", Pod:"csi-node-driver-2g2jv", 
Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.96.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali8f315c579b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.792 [INFO][5152] k8s.go 608: Cleaning up netns ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.792 [INFO][5152] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" iface="eth0" netns="" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.792 [INFO][5152] k8s.go 615: Releasing IP address(es) ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.792 [INFO][5152] utils.go 188: Calico CNI releasing IP address ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.818 [INFO][5158] ipam_plugin.go 417: Releasing address using handleID ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.818 [INFO][5158] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.818 [INFO][5158] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.823 [WARNING][5158] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.823 [INFO][5158] ipam_plugin.go 445: Releasing address using workloadID ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.824 [INFO][5158] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:41.831104 containerd[1509]: 2024-10-09 02:47:41.827 [INFO][5152] k8s.go 621: Teardown processing complete. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.832734 containerd[1509]: time="2024-10-09T02:47:41.831155630Z" level=info msg="TearDown network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" successfully" Oct 9 02:47:41.832734 containerd[1509]: time="2024-10-09T02:47:41.831175747Z" level=info msg="StopPodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" returns successfully" Oct 9 02:47:41.832734 containerd[1509]: time="2024-10-09T02:47:41.831692823Z" level=info msg="RemovePodSandbox for \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\"" Oct 9 02:47:41.832734 containerd[1509]: time="2024-10-09T02:47:41.831716458Z" level=info msg="Forcibly stopping sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\"" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.864 [WARNING][5176] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0cdabbef-c118-48a3-a126-6564dda90df7", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78cd84fb8c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"ffb4504e7d3b5628d1da232c76d3a36f817c9ae77c4700c8a664befad41dcbe3", Pod:"csi-node-driver-2g2jv", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.96.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali8f315c579b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.864 [INFO][5176] k8s.go 608: Cleaning up netns ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.864 [INFO][5176] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" iface="eth0" netns="" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.864 [INFO][5176] k8s.go 615: Releasing IP address(es) ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.864 [INFO][5176] utils.go 188: Calico CNI releasing IP address ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.882 [INFO][5182] ipam_plugin.go 417: Releasing address using handleID ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.882 [INFO][5182] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.882 [INFO][5182] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.886 [WARNING][5182] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.886 [INFO][5182] ipam_plugin.go 445: Releasing address using workloadID ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" HandleID="k8s-pod-network.c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Workload="ci--4116--0--0--c--ec98df32e3-k8s-csi--node--driver--2g2jv-eth0" Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.887 [INFO][5182] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:41.891611 containerd[1509]: 2024-10-09 02:47:41.889 [INFO][5176] k8s.go 621: Teardown processing complete. ContainerID="c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c" Oct 9 02:47:41.892000 containerd[1509]: time="2024-10-09T02:47:41.891624537Z" level=info msg="TearDown network for sandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" successfully" Oct 9 02:47:41.895091 containerd[1509]: time="2024-10-09T02:47:41.895065991Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Oct 9 02:47:41.895165 containerd[1509]: time="2024-10-09T02:47:41.895110945Z" level=info msg="RemovePodSandbox \"c3b8f204caecc3ee8a62b757fe71b244bb3c7cf0e67f4e2bd449d89584bd282c\" returns successfully" Oct 9 02:47:41.895517 containerd[1509]: time="2024-10-09T02:47:41.895499238Z" level=info msg="StopPodSandbox for \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\"" Oct 9 02:47:41.895607 containerd[1509]: time="2024-10-09T02:47:41.895585621Z" level=info msg="TearDown network for sandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" successfully" Oct 9 02:47:41.895607 containerd[1509]: time="2024-10-09T02:47:41.895598485Z" level=info msg="StopPodSandbox for \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" returns successfully" Oct 9 02:47:41.895875 containerd[1509]: time="2024-10-09T02:47:41.895858946Z" level=info msg="RemovePodSandbox for \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\"" Oct 9 02:47:41.895913 containerd[1509]: time="2024-10-09T02:47:41.895879425Z" level=info msg="Forcibly stopping sandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\"" Oct 9 02:47:41.895956 containerd[1509]: time="2024-10-09T02:47:41.895921344Z" level=info msg="TearDown network for sandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" successfully" Oct 9 02:47:41.898965 containerd[1509]: time="2024-10-09T02:47:41.898938929Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Oct 9 02:47:41.899476 containerd[1509]: time="2024-10-09T02:47:41.898970657Z" level=info msg="RemovePodSandbox \"cdb31533d40c873f4cb374de920a0c05324503c7eaf2edc31a15727945d8c93e\" returns successfully" Oct 9 02:47:41.899476 containerd[1509]: time="2024-10-09T02:47:41.899241629Z" level=info msg="StopPodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\"" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.928 [WARNING][5200] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"7854dae5-93f3-4cca-b78b-56f3e9616784", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491", Pod:"coredns-76f75df574-glmdc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0bd0cc3018", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.928 [INFO][5200] k8s.go 608: Cleaning up netns ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.928 [INFO][5200] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" iface="eth0" netns="" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.928 [INFO][5200] k8s.go 615: Releasing IP address(es) ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.928 [INFO][5200] utils.go 188: Calico CNI releasing IP address ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.946 [INFO][5206] ipam_plugin.go 417: Releasing address using handleID ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.946 [INFO][5206] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.946 [INFO][5206] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.950 [WARNING][5206] ipam_plugin.go 434: Asked to release address but it doesn't exist. Ignoring ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.950 [INFO][5206] ipam_plugin.go 445: Releasing address using workloadID ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.951 [INFO][5206] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:41.955514 containerd[1509]: 2024-10-09 02:47:41.953 [INFO][5200] k8s.go 621: Teardown processing complete. 
ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:41.956021 containerd[1509]: time="2024-10-09T02:47:41.955538244Z" level=info msg="TearDown network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" successfully" Oct 9 02:47:41.956021 containerd[1509]: time="2024-10-09T02:47:41.955579592Z" level=info msg="StopPodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" returns successfully" Oct 9 02:47:41.956021 containerd[1509]: time="2024-10-09T02:47:41.955991038Z" level=info msg="RemovePodSandbox for \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\"" Oct 9 02:47:41.956021 containerd[1509]: time="2024-10-09T02:47:41.956012419Z" level=info msg="Forcibly stopping sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\"" Oct 9 02:47:42.005885 systemd[1]: Started sshd@7-188.245.48.63:22-81.161.238.160:57000.service - OpenSSH per-connection server daemon (81.161.238.160:57000). Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:41.984 [WARNING][5224] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"7854dae5-93f3-4cca-b78b-56f3e9616784", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"a7ec2699abf02587bf4dcae684903ead18f7f2a115597bdbd562b5b8a56aa491", Pod:"coredns-76f75df574-glmdc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0bd0cc3018", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:41.984 [INFO][5224] k8s.go 
608: Cleaning up netns ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:41.984 [INFO][5224] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" iface="eth0" netns="" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:41.984 [INFO][5224] k8s.go 615: Releasing IP address(es) ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:41.984 [INFO][5224] utils.go 188: Calico CNI releasing IP address ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.005 [INFO][5231] ipam_plugin.go 417: Releasing address using handleID ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.006 [INFO][5231] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.006 [INFO][5231] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.012 [WARNING][5231] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.012 [INFO][5231] ipam_plugin.go 445: Releasing address using workloadID ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" HandleID="k8s-pod-network.7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--glmdc-eth0" Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.016 [INFO][5231] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:42.022788 containerd[1509]: 2024-10-09 02:47:42.019 [INFO][5224] k8s.go 621: Teardown processing complete. ContainerID="7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934" Oct 9 02:47:42.022788 containerd[1509]: time="2024-10-09T02:47:42.022337363Z" level=info msg="TearDown network for sandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" successfully" Oct 9 02:47:42.026841 containerd[1509]: time="2024-10-09T02:47:42.026804300Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Oct 9 02:47:42.026841 containerd[1509]: time="2024-10-09T02:47:42.026853724Z" level=info msg="RemovePodSandbox \"7a051a840124801e6cefcb87561f595a730ae84df5d9f4f5c7854bafab59d934\" returns successfully" Oct 9 02:47:42.027885 containerd[1509]: time="2024-10-09T02:47:42.027752830Z" level=info msg="StopPodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\"" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.062 [WARNING][5250] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8", Pod:"coredns-76f75df574-678dx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3259406dbce", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.062 [INFO][5250] k8s.go 608: Cleaning up netns ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.062 [INFO][5250] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" iface="eth0" netns="" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.062 [INFO][5250] k8s.go 615: Releasing IP address(es) ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.062 [INFO][5250] utils.go 188: Calico CNI releasing IP address ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.090 [INFO][5258] ipam_plugin.go 417: Releasing address using handleID ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.090 [INFO][5258] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.090 [INFO][5258] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.097 [WARNING][5258] ipam_plugin.go 434: Asked to release address but it doesn't exist. Ignoring ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.097 [INFO][5258] ipam_plugin.go 445: Releasing address using workloadID ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.098 [INFO][5258] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:42.102733 containerd[1509]: 2024-10-09 02:47:42.100 [INFO][5250] k8s.go 621: Teardown processing complete. 
ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.103194 containerd[1509]: time="2024-10-09T02:47:42.103099781Z" level=info msg="TearDown network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" successfully" Oct 9 02:47:42.103194 containerd[1509]: time="2024-10-09T02:47:42.103125009Z" level=info msg="StopPodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" returns successfully" Oct 9 02:47:42.103541 containerd[1509]: time="2024-10-09T02:47:42.103503393Z" level=info msg="RemovePodSandbox for \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\"" Oct 9 02:47:42.103600 containerd[1509]: time="2024-10-09T02:47:42.103554979Z" level=info msg="Forcibly stopping sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\"" Oct 9 02:47:42.144325 sshd[5235]: Connection closed by authenticating user root 81.161.238.160 port 57000 [preauth] Oct 9 02:47:42.146629 systemd[1]: sshd@7-188.245.48.63:22-81.161.238.160:57000.service: Deactivated successfully. Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.134 [WARNING][5277] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"32c2fe78-6ebe-4ea3-8cb3-1becb914ffff", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"11bc0a840d27f1f9da32f29a15951c3b7a68b3820e5b9dacee3d6592a41947e8", Pod:"coredns-76f75df574-678dx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3259406dbce", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.134 [INFO][5277] k8s.go 
608: Cleaning up netns ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.134 [INFO][5277] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" iface="eth0" netns="" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.134 [INFO][5277] k8s.go 615: Releasing IP address(es) ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.134 [INFO][5277] utils.go 188: Calico CNI releasing IP address ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.159 [INFO][5283] ipam_plugin.go 417: Releasing address using handleID ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.159 [INFO][5283] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.160 [INFO][5283] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.164 [WARNING][5283] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.164 [INFO][5283] ipam_plugin.go 445: Releasing address using workloadID ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" HandleID="k8s-pod-network.2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Workload="ci--4116--0--0--c--ec98df32e3-k8s-coredns--76f75df574--678dx-eth0" Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.166 [INFO][5283] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:42.171292 containerd[1509]: 2024-10-09 02:47:42.168 [INFO][5277] k8s.go 621: Teardown processing complete. ContainerID="2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f" Oct 9 02:47:42.172860 containerd[1509]: time="2024-10-09T02:47:42.171320125Z" level=info msg="TearDown network for sandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" successfully" Oct 9 02:47:42.174937 containerd[1509]: time="2024-10-09T02:47:42.174888749Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Oct 9 02:47:42.175207 containerd[1509]: time="2024-10-09T02:47:42.174963409Z" level=info msg="RemovePodSandbox \"2143b0ea30a5b5d8a67fef278799b60a11dae652d97e974993ee7104d0d1db5f\" returns successfully" Oct 9 02:47:42.175419 containerd[1509]: time="2024-10-09T02:47:42.175386758Z" level=info msg="StopPodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\"" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.208 [WARNING][5304] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0", GenerateName:"calico-kube-controllers-6d4cd8d49c-", Namespace:"calico-system", SelfLink:"", UID:"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d4cd8d49c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479", Pod:"calico-kube-controllers-6d4cd8d49c-l2sjn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidd72d92c6e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.208 [INFO][5304] k8s.go 608: Cleaning up netns ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.208 [INFO][5304] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" iface="eth0" netns="" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.208 [INFO][5304] k8s.go 615: Releasing IP address(es) ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.208 [INFO][5304] utils.go 188: Calico CNI releasing IP address ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.226 [INFO][5310] ipam_plugin.go 417: Releasing address using handleID ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.226 [INFO][5310] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.227 [INFO][5310] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.232 [WARNING][5310] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.232 [INFO][5310] ipam_plugin.go 445: Releasing address using workloadID ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.233 [INFO][5310] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:42.238473 containerd[1509]: 2024-10-09 02:47:42.236 [INFO][5304] k8s.go 621: Teardown processing complete. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.239080 containerd[1509]: time="2024-10-09T02:47:42.238504525Z" level=info msg="TearDown network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" successfully" Oct 9 02:47:42.239080 containerd[1509]: time="2024-10-09T02:47:42.238527529Z" level=info msg="StopPodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" returns successfully" Oct 9 02:47:42.239205 containerd[1509]: time="2024-10-09T02:47:42.239103396Z" level=info msg="RemovePodSandbox for \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\"" Oct 9 02:47:42.239205 containerd[1509]: time="2024-10-09T02:47:42.239155153Z" level=info msg="Forcibly stopping sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\"" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.274 [WARNING][5328] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0", GenerateName:"calico-kube-controllers-6d4cd8d49c-", Namespace:"calico-system", SelfLink:"", UID:"f5dd1ba5-411d-49c3-9c5f-0b47d4857d7e", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d4cd8d49c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"a036aed3a9570ae6300635def1738d79a3ddba818fcefc751ef7fc889e476479", Pod:"calico-kube-controllers-6d4cd8d49c-l2sjn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidd72d92c6e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.274 [INFO][5328] k8s.go 608: Cleaning up netns ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.274 [INFO][5328] dataplane_linux.go 526: CleanUpNamespace called with no netns name, 
ignoring. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" iface="eth0" netns="" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.274 [INFO][5328] k8s.go 615: Releasing IP address(es) ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.274 [INFO][5328] utils.go 188: Calico CNI releasing IP address ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.294 [INFO][5334] ipam_plugin.go 417: Releasing address using handleID ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.294 [INFO][5334] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.294 [INFO][5334] ipam_plugin.go 373: Acquired host-wide IPAM lock. Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.299 [WARNING][5334] ipam_plugin.go 434: Asked to release address but it doesn't exist. 
Ignoring ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.299 [INFO][5334] ipam_plugin.go 445: Releasing address using workloadID ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" HandleID="k8s-pod-network.0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--kube--controllers--6d4cd8d49c--l2sjn-eth0" Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.301 [INFO][5334] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:42.306749 containerd[1509]: 2024-10-09 02:47:42.304 [INFO][5328] k8s.go 621: Teardown processing complete. ContainerID="0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817" Oct 9 02:47:42.306749 containerd[1509]: time="2024-10-09T02:47:42.306716475Z" level=info msg="TearDown network for sandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" successfully" Oct 9 02:47:42.317790 containerd[1509]: time="2024-10-09T02:47:42.317756785Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Oct 9 02:47:42.317916 containerd[1509]: time="2024-10-09T02:47:42.317873595Z" level=info msg="RemovePodSandbox \"0f39baf9b7d5f27280737427ebcd54d61dd4efc75364a9d4be61597bcada6817\" returns successfully" Oct 9 02:47:47.323500 kubelet[2877]: I1009 02:47:47.323115 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-2g2jv" podStartSLOduration=41.267315312 podStartE2EDuration="47.323080103s" podCreationTimestamp="2024-10-09 02:47:00 +0000 UTC" firstStartedPulling="2024-10-09 02:47:27.258365132 +0000 UTC m=+45.649615576" lastFinishedPulling="2024-10-09 02:47:33.314129922 +0000 UTC m=+51.705380367" observedRunningTime="2024-10-09 02:47:34.131945817 +0000 UTC m=+52.523196261" watchObservedRunningTime="2024-10-09 02:47:47.323080103 +0000 UTC m=+65.714330548" Oct 9 02:47:50.242546 kubelet[2877]: I1009 02:47:50.242259 2877 topology_manager.go:215] "Topology Admit Handler" podUID="70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b" podNamespace="calico-apiserver" podName="calico-apiserver-765b5d9bbc-bddxz" Oct 9 02:47:50.245087 kubelet[2877]: I1009 02:47:50.244554 2877 topology_manager.go:215] "Topology Admit Handler" podUID="48878379-aa6c-4383-9cec-0198c5fbfb44" podNamespace="calico-apiserver" podName="calico-apiserver-765b5d9bbc-8wp8d" Oct 9 02:47:50.269795 systemd[1]: Created slice kubepods-besteffort-pod70d4d361_ceb6_4af8_a9d7_1ef7ab08cc1b.slice - libcontainer container kubepods-besteffort-pod70d4d361_ceb6_4af8_a9d7_1ef7ab08cc1b.slice. Oct 9 02:47:50.281724 systemd[1]: Created slice kubepods-besteffort-pod48878379_aa6c_4383_9cec_0198c5fbfb44.slice - libcontainer container kubepods-besteffort-pod48878379_aa6c_4383_9cec_0198c5fbfb44.slice. 
Oct 9 02:47:50.326389 kubelet[2877]: I1009 02:47:50.326349 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48878379-aa6c-4383-9cec-0198c5fbfb44-calico-apiserver-certs\") pod \"calico-apiserver-765b5d9bbc-8wp8d\" (UID: \"48878379-aa6c-4383-9cec-0198c5fbfb44\") " pod="calico-apiserver/calico-apiserver-765b5d9bbc-8wp8d" Oct 9 02:47:50.326389 kubelet[2877]: I1009 02:47:50.326403 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd5zz\" (UniqueName: \"kubernetes.io/projected/48878379-aa6c-4383-9cec-0198c5fbfb44-kube-api-access-wd5zz\") pod \"calico-apiserver-765b5d9bbc-8wp8d\" (UID: \"48878379-aa6c-4383-9cec-0198c5fbfb44\") " pod="calico-apiserver/calico-apiserver-765b5d9bbc-8wp8d" Oct 9 02:47:50.326595 kubelet[2877]: I1009 02:47:50.326447 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b-calico-apiserver-certs\") pod \"calico-apiserver-765b5d9bbc-bddxz\" (UID: \"70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b\") " pod="calico-apiserver/calico-apiserver-765b5d9bbc-bddxz" Oct 9 02:47:50.326595 kubelet[2877]: I1009 02:47:50.326469 2877 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lwv4\" (UniqueName: \"kubernetes.io/projected/70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b-kube-api-access-6lwv4\") pod \"calico-apiserver-765b5d9bbc-bddxz\" (UID: \"70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b\") " pod="calico-apiserver/calico-apiserver-765b5d9bbc-bddxz" Oct 9 02:47:50.428726 kubelet[2877]: E1009 02:47:50.428632 2877 secret.go:194] Couldn't get secret calico-apiserver/calico-apiserver-certs: secret "calico-apiserver-certs" not found Oct 9 02:47:50.428726 kubelet[2877]: E1009 02:47:50.428626 2877 secret.go:194] 
Couldn't get secret calico-apiserver/calico-apiserver-certs: secret "calico-apiserver-certs" not found Oct 9 02:47:50.437464 kubelet[2877]: E1009 02:47:50.436694 2877 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48878379-aa6c-4383-9cec-0198c5fbfb44-calico-apiserver-certs podName:48878379-aa6c-4383-9cec-0198c5fbfb44 nodeName:}" failed. No retries permitted until 2024-10-09 02:47:50.931623402 +0000 UTC m=+69.322873857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/48878379-aa6c-4383-9cec-0198c5fbfb44-calico-apiserver-certs") pod "calico-apiserver-765b5d9bbc-8wp8d" (UID: "48878379-aa6c-4383-9cec-0198c5fbfb44") : secret "calico-apiserver-certs" not found Oct 9 02:47:50.437464 kubelet[2877]: E1009 02:47:50.436724 2877 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b-calico-apiserver-certs podName:70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b nodeName:}" failed. No retries permitted until 2024-10-09 02:47:50.936713361 +0000 UTC m=+69.327963807 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b-calico-apiserver-certs") pod "calico-apiserver-765b5d9bbc-bddxz" (UID: "70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b") : secret "calico-apiserver-certs" not found Oct 9 02:47:51.177978 containerd[1509]: time="2024-10-09T02:47:51.177929813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-765b5d9bbc-bddxz,Uid:70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b,Namespace:calico-apiserver,Attempt:0,}" Oct 9 02:47:51.187069 containerd[1509]: time="2024-10-09T02:47:51.187023883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-765b5d9bbc-8wp8d,Uid:48878379-aa6c-4383-9cec-0198c5fbfb44,Namespace:calico-apiserver,Attempt:0,}" Oct 9 02:47:51.339134 systemd-networkd[1402]: cali07ff658a1ac: Link UP Oct 9 02:47:51.339416 systemd-networkd[1402]: cali07ff658a1ac: Gained carrier Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.256 [INFO][5412] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0 calico-apiserver-765b5d9bbc- calico-apiserver 48878379-aa6c-4383-9cec-0198c5fbfb44 920 0 2024-10-09 02:47:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:765b5d9bbc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4116-0-0-c-ec98df32e3 calico-apiserver-765b5d9bbc-8wp8d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali07ff658a1ac [] []}} ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-" Oct 9 
02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.256 [INFO][5412] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.292 [INFO][5432] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" HandleID="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.306 [INFO][5432] ipam_plugin.go 270: Auto assigning IP ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" HandleID="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318350), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4116-0-0-c-ec98df32e3", "pod":"calico-apiserver-765b5d9bbc-8wp8d", "timestamp":"2024-10-09 02:47:51.292908189 +0000 UTC"}, Hostname:"ci-4116-0-0-c-ec98df32e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.306 [INFO][5432] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.306 [INFO][5432] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.306 [INFO][5432] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4116-0-0-c-ec98df32e3' Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.309 [INFO][5432] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.312 [INFO][5432] ipam.go 372: Looking up existing affinities for host host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.316 [INFO][5432] ipam.go 489: Trying affinity for 192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.317 [INFO][5432] ipam.go 155: Attempting to load block cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.319 [INFO][5432] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.319 [INFO][5432] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.192/26 handle="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.320 [INFO][5432] ipam.go 1685: Creating new handle: k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617 Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.324 [INFO][5432] ipam.go 1203: Writing block in order to claim IPs block=192.168.96.192/26 handle="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.329 [INFO][5432] ipam.go 1216: Successfully claimed IPs: [192.168.96.197/26] 
block=192.168.96.192/26 handle="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.329 [INFO][5432] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.197/26] handle="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.330 [INFO][5432] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:51.358945 containerd[1509]: 2024-10-09 02:47:51.330 [INFO][5432] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.96.197/26] IPv6=[] ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" HandleID="k8s-pod-network.154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.363072 containerd[1509]: 2024-10-09 02:47:51.334 [INFO][5412] k8s.go 386: Populated endpoint ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0", GenerateName:"calico-apiserver-765b5d9bbc-", Namespace:"calico-apiserver", SelfLink:"", UID:"48878379-aa6c-4383-9cec-0198c5fbfb44", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"765b5d9bbc", 
"projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"", Pod:"calico-apiserver-765b5d9bbc-8wp8d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali07ff658a1ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:51.363072 containerd[1509]: 2024-10-09 02:47:51.334 [INFO][5412] k8s.go 387: Calico CNI using IPs: [192.168.96.197/32] ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.363072 containerd[1509]: 2024-10-09 02:47:51.335 [INFO][5412] dataplane_linux.go 68: Setting the host side veth name to cali07ff658a1ac ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.363072 containerd[1509]: 2024-10-09 02:47:51.338 [INFO][5412] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.363072 containerd[1509]: 2024-10-09 
02:47:51.341 [INFO][5412] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0", GenerateName:"calico-apiserver-765b5d9bbc-", Namespace:"calico-apiserver", SelfLink:"", UID:"48878379-aa6c-4383-9cec-0198c5fbfb44", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"765b5d9bbc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617", Pod:"calico-apiserver-765b5d9bbc-8wp8d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali07ff658a1ac", MAC:"56:23:81:3d:29:a0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:51.363072 containerd[1509]: 2024-10-09 02:47:51.350 [INFO][5412] k8s.go 
500: Wrote updated endpoint to datastore ContainerID="154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-8wp8d" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--8wp8d-eth0" Oct 9 02:47:51.410837 systemd-networkd[1402]: cali1cbf8cb21f5: Link UP Oct 9 02:47:51.413906 systemd-networkd[1402]: cali1cbf8cb21f5: Gained carrier Oct 9 02:47:51.414866 containerd[1509]: time="2024-10-09T02:47:51.413037438Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:51.414866 containerd[1509]: time="2024-10-09T02:47:51.413082823Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:51.414866 containerd[1509]: time="2024-10-09T02:47:51.413092321Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:51.414866 containerd[1509]: time="2024-10-09T02:47:51.413320431Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.240 [INFO][5404] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0 calico-apiserver-765b5d9bbc- calico-apiserver 70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b 917 0 2024-10-09 02:47:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:765b5d9bbc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4116-0-0-c-ec98df32e3 calico-apiserver-765b5d9bbc-bddxz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1cbf8cb21f5 [] []}} ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.240 [INFO][5404] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.295 [INFO][5428] ipam_plugin.go 230: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" HandleID="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.309 [INFO][5428] ipam_plugin.go 270: Auto assigning IP 
ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" HandleID="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318400), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4116-0-0-c-ec98df32e3", "pod":"calico-apiserver-765b5d9bbc-bddxz", "timestamp":"2024-10-09 02:47:51.295912244 +0000 UTC"}, Hostname:"ci-4116-0-0-c-ec98df32e3", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.309 [INFO][5428] ipam_plugin.go 358: About to acquire host-wide IPAM lock. Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.329 [INFO][5428] ipam_plugin.go 373: Acquired host-wide IPAM lock. 
Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.329 [INFO][5428] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4116-0-0-c-ec98df32e3' Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.334 [INFO][5428] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.343 [INFO][5428] ipam.go 372: Looking up existing affinities for host host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.354 [INFO][5428] ipam.go 489: Trying affinity for 192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.359 [INFO][5428] ipam.go 155: Attempting to load block cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.365 [INFO][5428] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.192/26 host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.365 [INFO][5428] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.192/26 handle="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.370 [INFO][5428] ipam.go 1685: Creating new handle: k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95 Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.382 [INFO][5428] ipam.go 1203: Writing block in order to claim IPs block=192.168.96.192/26 handle="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.394 [INFO][5428] ipam.go 1216: Successfully claimed IPs: [192.168.96.198/26] 
block=192.168.96.192/26 handle="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.395 [INFO][5428] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.198/26] handle="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" host="ci-4116-0-0-c-ec98df32e3" Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.395 [INFO][5428] ipam_plugin.go 379: Released host-wide IPAM lock. Oct 9 02:47:51.434756 containerd[1509]: 2024-10-09 02:47:51.396 [INFO][5428] ipam_plugin.go 288: Calico CNI IPAM assigned addresses IPv4=[192.168.96.198/26] IPv6=[] ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" HandleID="k8s-pod-network.7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Workload="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.437126 containerd[1509]: 2024-10-09 02:47:51.405 [INFO][5404] k8s.go 386: Populated endpoint ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0", GenerateName:"calico-apiserver-765b5d9bbc-", Namespace:"calico-apiserver", SelfLink:"", UID:"70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"765b5d9bbc", 
"projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"", Pod:"calico-apiserver-765b5d9bbc-bddxz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1cbf8cb21f5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:51.437126 containerd[1509]: 2024-10-09 02:47:51.405 [INFO][5404] k8s.go 387: Calico CNI using IPs: [192.168.96.198/32] ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.437126 containerd[1509]: 2024-10-09 02:47:51.406 [INFO][5404] dataplane_linux.go 68: Setting the host side veth name to cali1cbf8cb21f5 ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.437126 containerd[1509]: 2024-10-09 02:47:51.411 [INFO][5404] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.437126 containerd[1509]: 2024-10-09 
02:47:51.414 [INFO][5404] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0", GenerateName:"calico-apiserver-765b5d9bbc-", Namespace:"calico-apiserver", SelfLink:"", UID:"70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2024, time.October, 9, 2, 47, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"765b5d9bbc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4116-0-0-c-ec98df32e3", ContainerID:"7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95", Pod:"calico-apiserver-765b5d9bbc-bddxz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1cbf8cb21f5", MAC:"d2:0c:90:28:ab:f1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Oct 9 02:47:51.437126 containerd[1509]: 2024-10-09 02:47:51.426 [INFO][5404] k8s.go 
500: Wrote updated endpoint to datastore ContainerID="7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95" Namespace="calico-apiserver" Pod="calico-apiserver-765b5d9bbc-bddxz" WorkloadEndpoint="ci--4116--0--0--c--ec98df32e3-k8s-calico--apiserver--765b5d9bbc--bddxz-eth0" Oct 9 02:47:51.465724 systemd[1]: Started cri-containerd-154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617.scope - libcontainer container 154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617. Oct 9 02:47:51.477558 containerd[1509]: time="2024-10-09T02:47:51.477464684Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Oct 9 02:47:51.478149 containerd[1509]: time="2024-10-09T02:47:51.477951081Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Oct 9 02:47:51.478236 containerd[1509]: time="2024-10-09T02:47:51.478149505Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:51.478558 containerd[1509]: time="2024-10-09T02:47:51.478499596Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Oct 9 02:47:51.512742 systemd[1]: Started cri-containerd-7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95.scope - libcontainer container 7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95. 
Oct 9 02:47:51.537014 containerd[1509]: time="2024-10-09T02:47:51.536573761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-765b5d9bbc-8wp8d,Uid:48878379-aa6c-4383-9cec-0198c5fbfb44,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617\""
Oct 9 02:47:51.539643 containerd[1509]: time="2024-10-09T02:47:51.539619043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\""
Oct 9 02:47:51.569422 containerd[1509]: time="2024-10-09T02:47:51.569363776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-765b5d9bbc-bddxz,Uid:70d4d361-ceb6-4af8-a9d7-1ef7ab08cc1b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95\""
Oct 9 02:47:52.504633 systemd-networkd[1402]: cali1cbf8cb21f5: Gained IPv6LL
Oct 9 02:47:52.697065 systemd-networkd[1402]: cali07ff658a1ac: Gained IPv6LL
Oct 9 02:47:54.219370 containerd[1509]: time="2024-10-09T02:47:54.219320676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:54.220491 containerd[1509]: time="2024-10-09T02:47:54.220452810Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.28.1: active requests=0, bytes read=40419849"
Oct 9 02:47:54.221589 containerd[1509]: time="2024-10-09T02:47:54.221551913Z" level=info msg="ImageCreate event name:\"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:54.223598 containerd[1509]: time="2024-10-09T02:47:54.223565540Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b4ee1aa27bdeddc34dd200145eb033b716cf598570206c96693a35a317ab4f1e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:54.224159 containerd[1509]: time="2024-10-09T02:47:54.224133350Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" with image id \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b4ee1aa27bdeddc34dd200145eb033b716cf598570206c96693a35a317ab4f1e\", size \"41912266\" in 2.684486934s"
Oct 9 02:47:54.224660 containerd[1509]: time="2024-10-09T02:47:54.224158699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" returns image reference \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\""
Oct 9 02:47:54.225330 containerd[1509]: time="2024-10-09T02:47:54.225205983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\""
Oct 9 02:47:54.226154 containerd[1509]: time="2024-10-09T02:47:54.226048602Z" level=info msg="CreateContainer within sandbox \"154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Oct 9 02:47:54.254371 containerd[1509]: time="2024-10-09T02:47:54.253461660Z" level=info msg="CreateContainer within sandbox \"154f08ea6656096609e9ab4c46d6db15ddb6e45a9f31e799de18b077dd5f9617\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c384fb9a94103a46035c95fa8e3ea8f8597d737498d6c0be164f3b7b80df04e8\""
Oct 9 02:47:54.254194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2639126.mount: Deactivated successfully.
Oct 9 02:47:54.256057 containerd[1509]: time="2024-10-09T02:47:54.256012510Z" level=info msg="StartContainer for \"c384fb9a94103a46035c95fa8e3ea8f8597d737498d6c0be164f3b7b80df04e8\""
Oct 9 02:47:54.289593 systemd[1]: Started cri-containerd-c384fb9a94103a46035c95fa8e3ea8f8597d737498d6c0be164f3b7b80df04e8.scope - libcontainer container c384fb9a94103a46035c95fa8e3ea8f8597d737498d6c0be164f3b7b80df04e8.
Oct 9 02:47:54.328590 containerd[1509]: time="2024-10-09T02:47:54.327826577Z" level=info msg="StartContainer for \"c384fb9a94103a46035c95fa8e3ea8f8597d737498d6c0be164f3b7b80df04e8\" returns successfully"
Oct 9 02:47:54.624000 containerd[1509]: time="2024-10-09T02:47:54.623692912Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.28.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 9 02:47:54.626085 containerd[1509]: time="2024-10-09T02:47:54.626041740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.28.1: active requests=0, bytes read=77"
Oct 9 02:47:54.629210 containerd[1509]: time="2024-10-09T02:47:54.629174347Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" with image id \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.28.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b4ee1aa27bdeddc34dd200145eb033b716cf598570206c96693a35a317ab4f1e\", size \"41912266\" in 403.927477ms"
Oct 9 02:47:54.629344 containerd[1509]: time="2024-10-09T02:47:54.629211066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.1\" returns image reference \"sha256:91dd0fd3dab3f170b52404ec5e67926439207bf71c08b7f54de8f3db6209537b\""
Oct 9 02:47:54.632555 containerd[1509]: time="2024-10-09T02:47:54.632524904Z" level=info msg="CreateContainer within sandbox \"7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Oct 9 02:47:54.656867 containerd[1509]: time="2024-10-09T02:47:54.656802771Z" level=info msg="CreateContainer within sandbox \"7ffc039dc152c515b7437b7ed9e88c8778efe7fe8229f6d447147532c56ffa95\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"df46fa73e4f936559b93a9144614b5f7894142655d1a25d16f4477f04b58e8b1\""
Oct 9 02:47:54.660483 containerd[1509]: time="2024-10-09T02:47:54.659378678Z" level=info msg="StartContainer for \"df46fa73e4f936559b93a9144614b5f7894142655d1a25d16f4477f04b58e8b1\""
Oct 9 02:47:54.691924 systemd[1]: Started cri-containerd-df46fa73e4f936559b93a9144614b5f7894142655d1a25d16f4477f04b58e8b1.scope - libcontainer container df46fa73e4f936559b93a9144614b5f7894142655d1a25d16f4477f04b58e8b1.
Oct 9 02:47:54.742299 containerd[1509]: time="2024-10-09T02:47:54.742260644Z" level=info msg="StartContainer for \"df46fa73e4f936559b93a9144614b5f7894142655d1a25d16f4477f04b58e8b1\" returns successfully"
Oct 9 02:47:55.210364 kubelet[2877]: I1009 02:47:55.209844 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-765b5d9bbc-8wp8d" podStartSLOduration=2.5234461809999997 podStartE2EDuration="5.209804865s" podCreationTimestamp="2024-10-09 02:47:50 +0000 UTC" firstStartedPulling="2024-10-09 02:47:51.538327638 +0000 UTC m=+69.929578084" lastFinishedPulling="2024-10-09 02:47:54.224686324 +0000 UTC m=+72.615936768" observedRunningTime="2024-10-09 02:47:55.207148226 +0000 UTC m=+73.598398671" watchObservedRunningTime="2024-10-09 02:47:55.209804865 +0000 UTC m=+73.601055310"
Oct 9 02:47:55.210364 kubelet[2877]: I1009 02:47:55.209998 2877 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-765b5d9bbc-bddxz" podStartSLOduration=2.151175281 podStartE2EDuration="5.20998268s" podCreationTimestamp="2024-10-09 02:47:50 +0000 UTC" firstStartedPulling="2024-10-09 02:47:51.570743759 +0000 UTC m=+69.961994204" lastFinishedPulling="2024-10-09 02:47:54.629551158 +0000 UTC m=+73.020801603" observedRunningTime="2024-10-09 02:47:55.197408863 +0000 UTC m=+73.588659308" watchObservedRunningTime="2024-10-09 02:47:55.20998268 +0000 UTC m=+73.601233125"
Oct 9 02:48:35.406960 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.qnrsuL.mount: Deactivated successfully.
Oct 9 02:49:17.252052 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.k2mXFW.mount: Deactivated successfully.
Oct 9 02:50:43.536829 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.0rAqzX.mount: Deactivated successfully.
Oct 9 02:50:47.249733 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.eYLQkj.mount: Deactivated successfully.
Oct 9 02:51:30.183950 systemd[1]: Started sshd@8-188.245.48.63:22-80.64.30.139:42432.service - OpenSSH per-connection server daemon (80.64.30.139:42432).
Oct 9 02:51:31.356224 sshd[6192]: Invalid user admin from 80.64.30.139 port 42432
Oct 9 02:51:31.428699 sshd[6192]: Connection closed by invalid user admin 80.64.30.139 port 42432 [preauth]
Oct 9 02:51:31.434631 systemd[1]: sshd@8-188.245.48.63:22-80.64.30.139:42432.service: Deactivated successfully.
Oct 9 02:51:32.784844 systemd[1]: Started sshd@9-188.245.48.63:22-139.178.68.195:47892.service - OpenSSH per-connection server daemon (139.178.68.195:47892).
Oct 9 02:51:33.776891 sshd[6206]: Accepted publickey for core from 139.178.68.195 port 47892 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:51:33.780043 sshd[6206]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:51:33.786301 systemd-logind[1493]: New session 8 of user core.
Oct 9 02:51:33.790583 systemd[1]: Started session-8.scope - Session 8 of User core.
Oct 9 02:51:34.878925 sshd[6206]: pam_unix(sshd:session): session closed for user core
Oct 9 02:51:34.884817 systemd-logind[1493]: Session 8 logged out. Waiting for processes to exit.
Oct 9 02:51:34.885825 systemd[1]: sshd@9-188.245.48.63:22-139.178.68.195:47892.service: Deactivated successfully.
Oct 9 02:51:34.888587 systemd[1]: session-8.scope: Deactivated successfully.
Oct 9 02:51:34.892223 systemd-logind[1493]: Removed session 8.
Oct 9 02:51:40.046009 systemd[1]: Started sshd@10-188.245.48.63:22-139.178.68.195:47906.service - OpenSSH per-connection server daemon (139.178.68.195:47906).
Oct 9 02:51:41.058117 sshd[6238]: Accepted publickey for core from 139.178.68.195 port 47906 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:51:41.061307 sshd[6238]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:51:41.066654 systemd-logind[1493]: New session 9 of user core.
Oct 9 02:51:41.069616 systemd[1]: Started session-9.scope - Session 9 of User core.
Oct 9 02:51:41.871097 sshd[6238]: pam_unix(sshd:session): session closed for user core
Oct 9 02:51:41.874541 systemd[1]: sshd@10-188.245.48.63:22-139.178.68.195:47906.service: Deactivated successfully.
Oct 9 02:51:41.877037 systemd[1]: session-9.scope: Deactivated successfully.
Oct 9 02:51:41.879396 systemd-logind[1493]: Session 9 logged out. Waiting for processes to exit.
Oct 9 02:51:41.881183 systemd-logind[1493]: Removed session 9.
Oct 9 02:51:47.042582 systemd[1]: Started sshd@11-188.245.48.63:22-139.178.68.195:43482.service - OpenSSH per-connection server daemon (139.178.68.195:43482).
Oct 9 02:51:48.042853 sshd[6277]: Accepted publickey for core from 139.178.68.195 port 43482 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:51:48.046935 sshd[6277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:51:48.057260 systemd-logind[1493]: New session 10 of user core.
Oct 9 02:51:48.063685 systemd[1]: Started session-10.scope - Session 10 of User core.
Oct 9 02:51:48.943153 sshd[6277]: pam_unix(sshd:session): session closed for user core
Oct 9 02:51:48.949942 systemd[1]: sshd@11-188.245.48.63:22-139.178.68.195:43482.service: Deactivated successfully.
Oct 9 02:51:48.952344 systemd[1]: session-10.scope: Deactivated successfully.
Oct 9 02:51:48.954344 systemd-logind[1493]: Session 10 logged out. Waiting for processes to exit.
Oct 9 02:51:48.955902 systemd-logind[1493]: Removed session 10.
Oct 9 02:51:54.119009 systemd[1]: Started sshd@12-188.245.48.63:22-139.178.68.195:44654.service - OpenSSH per-connection server daemon (139.178.68.195:44654).
Oct 9 02:51:55.143693 sshd[6317]: Accepted publickey for core from 139.178.68.195 port 44654 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:51:55.146076 sshd[6317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:51:55.151218 systemd-logind[1493]: New session 11 of user core.
Oct 9 02:51:55.159625 systemd[1]: Started session-11.scope - Session 11 of User core.
Oct 9 02:51:55.918310 sshd[6317]: pam_unix(sshd:session): session closed for user core
Oct 9 02:51:55.922231 systemd-logind[1493]: Session 11 logged out. Waiting for processes to exit.
Oct 9 02:51:55.922902 systemd[1]: sshd@12-188.245.48.63:22-139.178.68.195:44654.service: Deactivated successfully.
Oct 9 02:51:55.925310 systemd[1]: session-11.scope: Deactivated successfully.
Oct 9 02:51:55.926884 systemd-logind[1493]: Removed session 11.
Oct 9 02:52:01.098720 systemd[1]: Started sshd@13-188.245.48.63:22-139.178.68.195:52598.service - OpenSSH per-connection server daemon (139.178.68.195:52598).
Oct 9 02:52:02.130216 sshd[6333]: Accepted publickey for core from 139.178.68.195 port 52598 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:02.132219 sshd[6333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:02.137543 systemd-logind[1493]: New session 12 of user core.
Oct 9 02:52:02.143625 systemd[1]: Started session-12.scope - Session 12 of User core.
Oct 9 02:52:02.891101 sshd[6333]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:02.894577 systemd[1]: sshd@13-188.245.48.63:22-139.178.68.195:52598.service: Deactivated successfully.
Oct 9 02:52:02.897304 systemd[1]: session-12.scope: Deactivated successfully.
Oct 9 02:52:02.899492 systemd-logind[1493]: Session 12 logged out. Waiting for processes to exit.
Oct 9 02:52:02.900691 systemd-logind[1493]: Removed session 12.
Oct 9 02:52:08.067762 systemd[1]: Started sshd@14-188.245.48.63:22-139.178.68.195:52600.service - OpenSSH per-connection server daemon (139.178.68.195:52600).
Oct 9 02:52:09.066747 sshd[6365]: Accepted publickey for core from 139.178.68.195 port 52600 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:09.068553 sshd[6365]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:09.073166 systemd-logind[1493]: New session 13 of user core.
Oct 9 02:52:09.077609 systemd[1]: Started session-13.scope - Session 13 of User core.
Oct 9 02:52:09.832277 sshd[6365]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:09.836535 systemd[1]: sshd@14-188.245.48.63:22-139.178.68.195:52600.service: Deactivated successfully.
Oct 9 02:52:09.838777 systemd[1]: session-13.scope: Deactivated successfully.
Oct 9 02:52:09.839581 systemd-logind[1493]: Session 13 logged out. Waiting for processes to exit.
Oct 9 02:52:09.840990 systemd-logind[1493]: Removed session 13.
Oct 9 02:52:15.013811 systemd[1]: Started sshd@15-188.245.48.63:22-139.178.68.195:35164.service - OpenSSH per-connection server daemon (139.178.68.195:35164).
Oct 9 02:52:16.029214 sshd[6405]: Accepted publickey for core from 139.178.68.195 port 35164 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:16.031119 sshd[6405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:16.036357 systemd-logind[1493]: New session 14 of user core.
Oct 9 02:52:16.044586 systemd[1]: Started session-14.scope - Session 14 of User core.
Oct 9 02:52:16.795403 sshd[6405]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:16.799696 systemd[1]: sshd@15-188.245.48.63:22-139.178.68.195:35164.service: Deactivated successfully.
Oct 9 02:52:16.802142 systemd[1]: session-14.scope: Deactivated successfully.
Oct 9 02:52:16.802896 systemd-logind[1493]: Session 14 logged out. Waiting for processes to exit.
Oct 9 02:52:16.804176 systemd-logind[1493]: Removed session 14.
Oct 9 02:52:21.982854 systemd[1]: Started sshd@16-188.245.48.63:22-139.178.68.195:46556.service - OpenSSH per-connection server daemon (139.178.68.195:46556).
Oct 9 02:52:23.008711 sshd[6441]: Accepted publickey for core from 139.178.68.195 port 46556 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:23.015608 sshd[6441]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:23.028819 systemd-logind[1493]: New session 15 of user core.
Oct 9 02:52:23.036534 systemd[1]: Started session-15.scope - Session 15 of User core.
Oct 9 02:52:23.766382 sshd[6441]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:23.770406 systemd[1]: sshd@16-188.245.48.63:22-139.178.68.195:46556.service: Deactivated successfully.
Oct 9 02:52:23.773574 systemd[1]: session-15.scope: Deactivated successfully.
Oct 9 02:52:23.774309 systemd-logind[1493]: Session 15 logged out. Waiting for processes to exit.
Oct 9 02:52:23.775284 systemd-logind[1493]: Removed session 15.
Oct 9 02:52:28.953685 systemd[1]: Started sshd@17-188.245.48.63:22-139.178.68.195:46570.service - OpenSSH per-connection server daemon (139.178.68.195:46570).
Oct 9 02:52:29.946346 sshd[6462]: Accepted publickey for core from 139.178.68.195 port 46570 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:29.948731 sshd[6462]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:29.953469 systemd-logind[1493]: New session 16 of user core.
Oct 9 02:52:29.956557 systemd[1]: Started session-16.scope - Session 16 of User core.
Oct 9 02:52:30.727614 sshd[6462]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:30.734489 systemd[1]: sshd@17-188.245.48.63:22-139.178.68.195:46570.service: Deactivated successfully.
Oct 9 02:52:30.738501 systemd[1]: session-16.scope: Deactivated successfully.
Oct 9 02:52:30.740206 systemd-logind[1493]: Session 16 logged out. Waiting for processes to exit.
Oct 9 02:52:30.742110 systemd-logind[1493]: Removed session 16.
Oct 9 02:52:35.362553 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.ugp53H.mount: Deactivated successfully.
Oct 9 02:52:35.911166 systemd[1]: Started sshd@18-188.245.48.63:22-139.178.68.195:47942.service - OpenSSH per-connection server daemon (139.178.68.195:47942).
Oct 9 02:52:36.924486 sshd[6500]: Accepted publickey for core from 139.178.68.195 port 47942 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:36.929010 sshd[6500]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:36.938083 systemd-logind[1493]: New session 17 of user core.
Oct 9 02:52:36.944633 systemd[1]: Started session-17.scope - Session 17 of User core.
Oct 9 02:52:37.700841 sshd[6500]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:37.707598 systemd-logind[1493]: Session 17 logged out. Waiting for processes to exit.
Oct 9 02:52:37.708832 systemd[1]: sshd@18-188.245.48.63:22-139.178.68.195:47942.service: Deactivated successfully.
Oct 9 02:52:37.712254 systemd[1]: session-17.scope: Deactivated successfully.
Oct 9 02:52:37.713471 systemd-logind[1493]: Removed session 17.
Oct 9 02:52:42.880581 systemd[1]: Started sshd@19-188.245.48.63:22-139.178.68.195:38622.service - OpenSSH per-connection server daemon (139.178.68.195:38622).
Oct 9 02:52:43.900000 sshd[6522]: Accepted publickey for core from 139.178.68.195 port 38622 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:43.904661 sshd[6522]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:43.913209 systemd-logind[1493]: New session 18 of user core.
Oct 9 02:52:43.923690 systemd[1]: Started session-18.scope - Session 18 of User core.
Oct 9 02:52:44.734906 sshd[6522]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:44.740954 systemd-logind[1493]: Session 18 logged out. Waiting for processes to exit.
Oct 9 02:52:44.741742 systemd[1]: sshd@19-188.245.48.63:22-139.178.68.195:38622.service: Deactivated successfully.
Oct 9 02:52:44.744202 systemd[1]: session-18.scope: Deactivated successfully.
Oct 9 02:52:44.746161 systemd-logind[1493]: Removed session 18.
Oct 9 02:52:49.916886 systemd[1]: Started sshd@20-188.245.48.63:22-139.178.68.195:38626.service - OpenSSH per-connection server daemon (139.178.68.195:38626).
Oct 9 02:52:50.929854 sshd[6585]: Accepted publickey for core from 139.178.68.195 port 38626 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:50.933003 sshd[6585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:50.942503 systemd-logind[1493]: New session 19 of user core.
Oct 9 02:52:50.948733 systemd[1]: Started session-19.scope - Session 19 of User core.
Oct 9 02:52:51.692767 sshd[6585]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:51.700094 systemd[1]: sshd@20-188.245.48.63:22-139.178.68.195:38626.service: Deactivated successfully.
Oct 9 02:52:51.704678 systemd[1]: session-19.scope: Deactivated successfully.
Oct 9 02:52:51.706366 systemd-logind[1493]: Session 19 logged out. Waiting for processes to exit.
Oct 9 02:52:51.708932 systemd-logind[1493]: Removed session 19.
Oct 9 02:52:56.871898 systemd[1]: Started sshd@21-188.245.48.63:22-139.178.68.195:47376.service - OpenSSH per-connection server daemon (139.178.68.195:47376).
Oct 9 02:52:57.879991 sshd[6607]: Accepted publickey for core from 139.178.68.195 port 47376 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:52:57.882743 sshd[6607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:52:57.891252 systemd-logind[1493]: New session 20 of user core.
Oct 9 02:52:57.896566 systemd[1]: Started session-20.scope - Session 20 of User core.
Oct 9 02:52:58.681675 sshd[6607]: pam_unix(sshd:session): session closed for user core
Oct 9 02:52:58.686615 systemd[1]: sshd@21-188.245.48.63:22-139.178.68.195:47376.service: Deactivated successfully.
Oct 9 02:52:58.690041 systemd[1]: session-20.scope: Deactivated successfully.
Oct 9 02:52:58.691249 systemd-logind[1493]: Session 20 logged out. Waiting for processes to exit.
Oct 9 02:52:58.692546 systemd-logind[1493]: Removed session 20.
Oct 9 02:53:03.858967 systemd[1]: Started sshd@22-188.245.48.63:22-139.178.68.195:41580.service - OpenSSH per-connection server daemon (139.178.68.195:41580).
Oct 9 02:53:04.868575 sshd[6621]: Accepted publickey for core from 139.178.68.195 port 41580 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:04.872214 sshd[6621]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:04.881364 systemd-logind[1493]: New session 21 of user core.
Oct 9 02:53:04.886662 systemd[1]: Started session-21.scope - Session 21 of User core.
Oct 9 02:53:05.629876 sshd[6621]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:05.633816 systemd[1]: sshd@22-188.245.48.63:22-139.178.68.195:41580.service: Deactivated successfully.
Oct 9 02:53:05.635878 systemd[1]: session-21.scope: Deactivated successfully.
Oct 9 02:53:05.636523 systemd-logind[1493]: Session 21 logged out. Waiting for processes to exit.
Oct 9 02:53:05.637768 systemd-logind[1493]: Removed session 21.
Oct 9 02:53:10.822905 systemd[1]: Started sshd@23-188.245.48.63:22-139.178.68.195:40586.service - OpenSSH per-connection server daemon (139.178.68.195:40586).
Oct 9 02:53:11.931093 sshd[6640]: Accepted publickey for core from 139.178.68.195 port 40586 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:11.936525 sshd[6640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:11.944564 systemd-logind[1493]: New session 22 of user core.
Oct 9 02:53:11.950692 systemd[1]: Started session-22.scope - Session 22 of User core.
Oct 9 02:53:12.819371 sshd[6640]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:12.825258 systemd[1]: sshd@23-188.245.48.63:22-139.178.68.195:40586.service: Deactivated successfully.
Oct 9 02:53:12.825623 systemd-logind[1493]: Session 22 logged out. Waiting for processes to exit.
Oct 9 02:53:12.829313 systemd[1]: session-22.scope: Deactivated successfully.
Oct 9 02:53:12.831889 systemd-logind[1493]: Removed session 22.
Oct 9 02:53:18.001748 systemd[1]: Started sshd@24-188.245.48.63:22-139.178.68.195:40596.service - OpenSSH per-connection server daemon (139.178.68.195:40596).
Oct 9 02:53:19.004502 sshd[6702]: Accepted publickey for core from 139.178.68.195 port 40596 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:19.007331 sshd[6702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:19.016237 systemd-logind[1493]: New session 23 of user core.
Oct 9 02:53:19.020686 systemd[1]: Started session-23.scope - Session 23 of User core.
Oct 9 02:53:19.794843 sshd[6702]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:19.800646 systemd[1]: sshd@24-188.245.48.63:22-139.178.68.195:40596.service: Deactivated successfully.
Oct 9 02:53:19.803700 systemd[1]: session-23.scope: Deactivated successfully.
Oct 9 02:53:19.804675 systemd-logind[1493]: Session 23 logged out. Waiting for processes to exit.
Oct 9 02:53:19.806511 systemd-logind[1493]: Removed session 23.
Oct 9 02:53:24.978770 systemd[1]: Started sshd@25-188.245.48.63:22-139.178.68.195:33868.service - OpenSSH per-connection server daemon (139.178.68.195:33868).
Oct 9 02:53:25.988759 sshd[6716]: Accepted publickey for core from 139.178.68.195 port 33868 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:25.990853 sshd[6716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:25.996156 systemd-logind[1493]: New session 24 of user core.
Oct 9 02:53:25.999571 systemd[1]: Started session-24.scope - Session 24 of User core.
Oct 9 02:53:26.800572 sshd[6716]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:26.805882 systemd[1]: sshd@25-188.245.48.63:22-139.178.68.195:33868.service: Deactivated successfully.
Oct 9 02:53:26.809281 systemd[1]: session-24.scope: Deactivated successfully.
Oct 9 02:53:26.811317 systemd-logind[1493]: Session 24 logged out. Waiting for processes to exit.
Oct 9 02:53:26.812871 systemd-logind[1493]: Removed session 24.
Oct 9 02:53:31.980775 systemd[1]: Started sshd@26-188.245.48.63:22-139.178.68.195:59154.service - OpenSSH per-connection server daemon (139.178.68.195:59154).
Oct 9 02:53:33.021462 sshd[6737]: Accepted publickey for core from 139.178.68.195 port 59154 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:33.023536 sshd[6737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:33.028847 systemd-logind[1493]: New session 25 of user core.
Oct 9 02:53:33.034593 systemd[1]: Started session-25.scope - Session 25 of User core.
Oct 9 02:53:33.847210 sshd[6737]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:33.852714 systemd[1]: sshd@26-188.245.48.63:22-139.178.68.195:59154.service: Deactivated successfully.
Oct 9 02:53:33.856183 systemd[1]: session-25.scope: Deactivated successfully.
Oct 9 02:53:33.859675 systemd-logind[1493]: Session 25 logged out. Waiting for processes to exit.
Oct 9 02:53:33.861572 systemd-logind[1493]: Removed session 25.
Oct 9 02:53:35.364507 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.OJUQVr.mount: Deactivated successfully.
Oct 9 02:53:39.024776 systemd[1]: Started sshd@27-188.245.48.63:22-139.178.68.195:59158.service - OpenSSH per-connection server daemon (139.178.68.195:59158).
Oct 9 02:53:40.030813 sshd[6775]: Accepted publickey for core from 139.178.68.195 port 59158 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:40.033837 sshd[6775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:40.041835 systemd-logind[1493]: New session 26 of user core.
Oct 9 02:53:40.051680 systemd[1]: Started session-26.scope - Session 26 of User core.
Oct 9 02:53:40.842053 sshd[6775]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:40.846149 systemd[1]: sshd@27-188.245.48.63:22-139.178.68.195:59158.service: Deactivated successfully.
Oct 9 02:53:40.848647 systemd[1]: session-26.scope: Deactivated successfully.
Oct 9 02:53:40.849877 systemd-logind[1493]: Session 26 logged out. Waiting for processes to exit.
Oct 9 02:53:40.851134 systemd-logind[1493]: Removed session 26.
Oct 9 02:53:46.021875 systemd[1]: Started sshd@28-188.245.48.63:22-139.178.68.195:41882.service - OpenSSH per-connection server daemon (139.178.68.195:41882).
Oct 9 02:53:47.025954 sshd[6822]: Accepted publickey for core from 139.178.68.195 port 41882 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:47.027513 sshd[6822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:47.031573 systemd-logind[1493]: New session 27 of user core.
Oct 9 02:53:47.038572 systemd[1]: Started session-27.scope - Session 27 of User core.
Oct 9 02:53:47.247090 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.Crm6Tu.mount: Deactivated successfully.
Oct 9 02:53:47.853283 sshd[6822]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:47.860391 systemd[1]: sshd@28-188.245.48.63:22-139.178.68.195:41882.service: Deactivated successfully.
Oct 9 02:53:47.862995 systemd[1]: session-27.scope: Deactivated successfully.
Oct 9 02:53:47.864169 systemd-logind[1493]: Session 27 logged out. Waiting for processes to exit.
Oct 9 02:53:47.865298 systemd-logind[1493]: Removed session 27.
Oct 9 02:53:53.033733 systemd[1]: Started sshd@29-188.245.48.63:22-139.178.68.195:51608.service - OpenSSH per-connection server daemon (139.178.68.195:51608).
Oct 9 02:53:54.045759 sshd[6862]: Accepted publickey for core from 139.178.68.195 port 51608 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:53:54.047953 sshd[6862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:53:54.053750 systemd-logind[1493]: New session 28 of user core.
Oct 9 02:53:54.059611 systemd[1]: Started session-28.scope - Session 28 of User core.
Oct 9 02:53:54.803037 sshd[6862]: pam_unix(sshd:session): session closed for user core
Oct 9 02:53:54.806467 systemd[1]: sshd@29-188.245.48.63:22-139.178.68.195:51608.service: Deactivated successfully.
Oct 9 02:53:54.808950 systemd[1]: session-28.scope: Deactivated successfully.
Oct 9 02:53:54.811427 systemd-logind[1493]: Session 28 logged out. Waiting for processes to exit.
Oct 9 02:53:54.813712 systemd-logind[1493]: Removed session 28.
Oct 9 02:53:59.980753 systemd[1]: Started sshd@30-188.245.48.63:22-139.178.68.195:51616.service - OpenSSH per-connection server daemon (139.178.68.195:51616).
Oct 9 02:54:00.984003 sshd[6883]: Accepted publickey for core from 139.178.68.195 port 51616 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:00.985544 sshd[6883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:00.989655 systemd-logind[1493]: New session 29 of user core.
Oct 9 02:54:00.999594 systemd[1]: Started session-29.scope - Session 29 of User core.
Oct 9 02:54:01.764333 sshd[6883]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:01.768130 systemd[1]: sshd@30-188.245.48.63:22-139.178.68.195:51616.service: Deactivated successfully.
Oct 9 02:54:01.770128 systemd[1]: session-29.scope: Deactivated successfully.
Oct 9 02:54:01.771242 systemd-logind[1493]: Session 29 logged out. Waiting for processes to exit.
Oct 9 02:54:01.772295 systemd-logind[1493]: Removed session 29.
Oct 9 02:54:06.941704 systemd[1]: Started sshd@31-188.245.48.63:22-139.178.68.195:50844.service - OpenSSH per-connection server daemon (139.178.68.195:50844).
Oct 9 02:54:07.943074 sshd[6897]: Accepted publickey for core from 139.178.68.195 port 50844 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:07.945262 sshd[6897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:07.950357 systemd-logind[1493]: New session 30 of user core.
Oct 9 02:54:07.955604 systemd[1]: Started session-30.scope - Session 30 of User core.
Oct 9 02:54:08.697083 sshd[6897]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:08.700159 systemd[1]: sshd@31-188.245.48.63:22-139.178.68.195:50844.service: Deactivated successfully.
Oct 9 02:54:08.702555 systemd[1]: session-30.scope: Deactivated successfully.
Oct 9 02:54:08.704353 systemd-logind[1493]: Session 30 logged out. Waiting for processes to exit.
Oct 9 02:54:08.705873 systemd-logind[1493]: Removed session 30.
Oct 9 02:54:13.533166 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.1X4XXO.mount: Deactivated successfully.
Oct 9 02:54:13.883855 systemd[1]: Started sshd@32-188.245.48.63:22-139.178.68.195:58098.service - OpenSSH per-connection server daemon (139.178.68.195:58098).
Oct 9 02:54:14.921534 sshd[6936]: Accepted publickey for core from 139.178.68.195 port 58098 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:14.924900 sshd[6936]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:14.929671 systemd-logind[1493]: New session 31 of user core.
Oct 9 02:54:14.936644 systemd[1]: Started session-31.scope - Session 31 of User core.
Oct 9 02:54:15.705902 sshd[6936]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:15.709242 systemd[1]: sshd@32-188.245.48.63:22-139.178.68.195:58098.service: Deactivated successfully.
Oct 9 02:54:15.711588 systemd[1]: session-31.scope: Deactivated successfully.
Oct 9 02:54:15.713610 systemd-logind[1493]: Session 31 logged out. Waiting for processes to exit.
Oct 9 02:54:15.714988 systemd-logind[1493]: Removed session 31.
Oct 9 02:54:17.253571 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.yt4fIW.mount: Deactivated successfully.
Oct 9 02:54:20.886826 systemd[1]: Started sshd@33-188.245.48.63:22-139.178.68.195:36032.service - OpenSSH per-connection server daemon (139.178.68.195:36032).
Oct 9 02:54:21.884174 sshd[6979]: Accepted publickey for core from 139.178.68.195 port 36032 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:21.886599 sshd[6979]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:21.891906 systemd-logind[1493]: New session 32 of user core.
Oct 9 02:54:21.898562 systemd[1]: Started session-32.scope - Session 32 of User core.
Oct 9 02:54:22.661678 sshd[6979]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:22.666896 systemd[1]: sshd@33-188.245.48.63:22-139.178.68.195:36032.service: Deactivated successfully.
Oct 9 02:54:22.669462 systemd[1]: session-32.scope: Deactivated successfully.
Oct 9 02:54:22.670287 systemd-logind[1493]: Session 32 logged out. Waiting for processes to exit.
Oct 9 02:54:22.671655 systemd-logind[1493]: Removed session 32.
Oct 9 02:54:27.834682 systemd[1]: Started sshd@34-188.245.48.63:22-139.178.68.195:36046.service - OpenSSH per-connection server daemon (139.178.68.195:36046).
Oct 9 02:54:28.834357 sshd[6996]: Accepted publickey for core from 139.178.68.195 port 36046 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:28.836566 sshd[6996]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:28.842490 systemd-logind[1493]: New session 33 of user core.
Oct 9 02:54:28.845601 systemd[1]: Started session-33.scope - Session 33 of User core.
Oct 9 02:54:29.601346 sshd[6996]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:29.604462 systemd[1]: sshd@34-188.245.48.63:22-139.178.68.195:36046.service: Deactivated successfully.
Oct 9 02:54:29.606852 systemd[1]: session-33.scope: Deactivated successfully.
Oct 9 02:54:29.608873 systemd-logind[1493]: Session 33 logged out. Waiting for processes to exit.
Oct 9 02:54:29.610094 systemd-logind[1493]: Removed session 33.
Oct 9 02:54:34.777791 systemd[1]: Started sshd@35-188.245.48.63:22-139.178.68.195:59184.service - OpenSSH per-connection server daemon (139.178.68.195:59184).
Oct 9 02:54:35.789595 sshd[7015]: Accepted publickey for core from 139.178.68.195 port 59184 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:35.792045 sshd[7015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:35.797977 systemd-logind[1493]: New session 34 of user core.
Oct 9 02:54:35.802568 systemd[1]: Started session-34.scope - Session 34 of User core.
Oct 9 02:54:36.690289 sshd[7015]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:36.694028 systemd[1]: sshd@35-188.245.48.63:22-139.178.68.195:59184.service: Deactivated successfully.
Oct 9 02:54:36.696840 systemd[1]: session-34.scope: Deactivated successfully.
Oct 9 02:54:36.699160 systemd-logind[1493]: Session 34 logged out. Waiting for processes to exit.
Oct 9 02:54:36.700600 systemd-logind[1493]: Removed session 34.
Oct 9 02:54:41.865696 systemd[1]: Started sshd@36-188.245.48.63:22-139.178.68.195:39600.service - OpenSSH per-connection server daemon (139.178.68.195:39600).
Oct 9 02:54:42.856538 sshd[7056]: Accepted publickey for core from 139.178.68.195 port 39600 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:42.858409 sshd[7056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:42.863568 systemd-logind[1493]: New session 35 of user core.
Oct 9 02:54:42.872628 systemd[1]: Started session-35.scope - Session 35 of User core.
Oct 9 02:54:43.532221 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.9A57sz.mount: Deactivated successfully.
Oct 9 02:54:43.622543 sshd[7056]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:43.625654 systemd[1]: sshd@36-188.245.48.63:22-139.178.68.195:39600.service: Deactivated successfully.
Oct 9 02:54:43.627729 systemd[1]: session-35.scope: Deactivated successfully.
Oct 9 02:54:43.629030 systemd-logind[1493]: Session 35 logged out. Waiting for processes to exit.
Oct 9 02:54:43.630677 systemd-logind[1493]: Removed session 35.
Oct 9 02:54:48.802791 systemd[1]: Started sshd@37-188.245.48.63:22-139.178.68.195:39608.service - OpenSSH per-connection server daemon (139.178.68.195:39608).
Oct 9 02:54:49.792777 sshd[7113]: Accepted publickey for core from 139.178.68.195 port 39608 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:49.794581 sshd[7113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:49.799995 systemd-logind[1493]: New session 36 of user core.
Oct 9 02:54:49.804583 systemd[1]: Started session-36.scope - Session 36 of User core.
Oct 9 02:54:50.582004 sshd[7113]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:50.587875 systemd-logind[1493]: Session 36 logged out. Waiting for processes to exit.
Oct 9 02:54:50.589394 systemd[1]: sshd@37-188.245.48.63:22-139.178.68.195:39608.service: Deactivated successfully.
Oct 9 02:54:50.593401 systemd[1]: session-36.scope: Deactivated successfully.
Oct 9 02:54:50.595780 systemd-logind[1493]: Removed session 36.
Oct 9 02:54:55.759698 systemd[1]: Started sshd@38-188.245.48.63:22-139.178.68.195:33664.service - OpenSSH per-connection server daemon (139.178.68.195:33664).
Oct 9 02:54:56.759923 sshd[7134]: Accepted publickey for core from 139.178.68.195 port 33664 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:54:56.761687 sshd[7134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:54:56.766859 systemd-logind[1493]: New session 37 of user core.
Oct 9 02:54:56.774572 systemd[1]: Started session-37.scope - Session 37 of User core.
Oct 9 02:54:57.518199 sshd[7134]: pam_unix(sshd:session): session closed for user core
Oct 9 02:54:57.522896 systemd[1]: sshd@38-188.245.48.63:22-139.178.68.195:33664.service: Deactivated successfully.
Oct 9 02:54:57.525742 systemd[1]: session-37.scope: Deactivated successfully.
Oct 9 02:54:57.526792 systemd-logind[1493]: Session 37 logged out. Waiting for processes to exit.
Oct 9 02:54:57.528596 systemd-logind[1493]: Removed session 37.
Oct 9 02:55:02.696723 systemd[1]: Started sshd@39-188.245.48.63:22-139.178.68.195:59836.service - OpenSSH per-connection server daemon (139.178.68.195:59836).
Oct 9 02:55:03.687156 sshd[7153]: Accepted publickey for core from 139.178.68.195 port 59836 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:03.688919 sshd[7153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:03.693898 systemd-logind[1493]: New session 38 of user core.
Oct 9 02:55:03.697582 systemd[1]: Started session-38.scope - Session 38 of User core.
Oct 9 02:55:04.455408 sshd[7153]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:04.459213 systemd[1]: sshd@39-188.245.48.63:22-139.178.68.195:59836.service: Deactivated successfully.
Oct 9 02:55:04.461919 systemd[1]: session-38.scope: Deactivated successfully.
Oct 9 02:55:04.463648 systemd-logind[1493]: Session 38 logged out. Waiting for processes to exit.
Oct 9 02:55:04.465008 systemd-logind[1493]: Removed session 38.
Oct 9 02:55:09.634874 systemd[1]: Started sshd@40-188.245.48.63:22-139.178.68.195:59850.service - OpenSSH per-connection server daemon (139.178.68.195:59850).
Oct 9 02:55:10.636717 sshd[7167]: Accepted publickey for core from 139.178.68.195 port 59850 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:10.638342 sshd[7167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:10.644191 systemd-logind[1493]: New session 39 of user core.
Oct 9 02:55:10.646615 systemd[1]: Started session-39.scope - Session 39 of User core.
Oct 9 02:55:11.406731 sshd[7167]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:11.411566 systemd[1]: sshd@40-188.245.48.63:22-139.178.68.195:59850.service: Deactivated successfully.
Oct 9 02:55:11.414056 systemd[1]: session-39.scope: Deactivated successfully.
Oct 9 02:55:11.414859 systemd-logind[1493]: Session 39 logged out. Waiting for processes to exit.
Oct 9 02:55:11.416333 systemd-logind[1493]: Removed session 39.
Oct 9 02:55:13.535394 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.3I73l3.mount: Deactivated successfully.
Oct 9 02:55:16.585184 systemd[1]: Started sshd@41-188.245.48.63:22-139.178.68.195:36890.service - OpenSSH per-connection server daemon (139.178.68.195:36890).
Oct 9 02:55:17.244272 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.rOFCME.mount: Deactivated successfully.
Oct 9 02:55:17.574127 sshd[7217]: Accepted publickey for core from 139.178.68.195 port 36890 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:17.575966 sshd[7217]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:17.580898 systemd-logind[1493]: New session 40 of user core.
Oct 9 02:55:17.586617 systemd[1]: Started session-40.scope - Session 40 of User core.
Oct 9 02:55:18.424631 sshd[7217]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:18.429002 systemd[1]: sshd@41-188.245.48.63:22-139.178.68.195:36890.service: Deactivated successfully.
Oct 9 02:55:18.431419 systemd[1]: session-40.scope: Deactivated successfully.
Oct 9 02:55:18.432364 systemd-logind[1493]: Session 40 logged out. Waiting for processes to exit.
Oct 9 02:55:18.433906 systemd-logind[1493]: Removed session 40.
Oct 9 02:55:23.598160 systemd[1]: Started sshd@42-188.245.48.63:22-139.178.68.195:50854.service - OpenSSH per-connection server daemon (139.178.68.195:50854).
Oct 9 02:55:24.603995 sshd[7258]: Accepted publickey for core from 139.178.68.195 port 50854 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:24.605666 sshd[7258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:24.610377 systemd-logind[1493]: New session 41 of user core.
Oct 9 02:55:24.616566 systemd[1]: Started session-41.scope - Session 41 of User core.
Oct 9 02:55:25.347412 sshd[7258]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:25.351706 systemd[1]: sshd@42-188.245.48.63:22-139.178.68.195:50854.service: Deactivated successfully.
Oct 9 02:55:25.353880 systemd[1]: session-41.scope: Deactivated successfully.
Oct 9 02:55:25.355013 systemd-logind[1493]: Session 41 logged out. Waiting for processes to exit.
Oct 9 02:55:25.356057 systemd-logind[1493]: Removed session 41.
Oct 9 02:55:30.525057 systemd[1]: Started sshd@43-188.245.48.63:22-139.178.68.195:50862.service - OpenSSH per-connection server daemon (139.178.68.195:50862).
Oct 9 02:55:31.554053 sshd[7274]: Accepted publickey for core from 139.178.68.195 port 50862 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:31.556399 sshd[7274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:31.561593 systemd-logind[1493]: New session 42 of user core.
Oct 9 02:55:31.568587 systemd[1]: Started session-42.scope - Session 42 of User core.
Oct 9 02:55:32.323954 sshd[7274]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:32.327945 systemd[1]: sshd@43-188.245.48.63:22-139.178.68.195:50862.service: Deactivated successfully.
Oct 9 02:55:32.329999 systemd[1]: session-42.scope: Deactivated successfully.
Oct 9 02:55:32.330703 systemd-logind[1493]: Session 42 logged out. Waiting for processes to exit.
Oct 9 02:55:32.332256 systemd-logind[1493]: Removed session 42.
Oct 9 02:55:37.505980 systemd[1]: Started sshd@44-188.245.48.63:22-139.178.68.195:59340.service - OpenSSH per-connection server daemon (139.178.68.195:59340).
Oct 9 02:55:38.520272 sshd[7312]: Accepted publickey for core from 139.178.68.195 port 59340 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:38.523181 sshd[7312]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:38.532230 systemd-logind[1493]: New session 43 of user core.
Oct 9 02:55:38.540659 systemd[1]: Started session-43.scope - Session 43 of User core.
Oct 9 02:55:39.291271 sshd[7312]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:39.297314 systemd[1]: sshd@44-188.245.48.63:22-139.178.68.195:59340.service: Deactivated successfully.
Oct 9 02:55:39.300153 systemd[1]: session-43.scope: Deactivated successfully.
Oct 9 02:55:39.301426 systemd-logind[1493]: Session 43 logged out. Waiting for processes to exit.
Oct 9 02:55:39.303205 systemd-logind[1493]: Removed session 43.
Oct 9 02:55:39.470042 systemd[1]: Started sshd@45-188.245.48.63:22-139.178.68.195:59346.service - OpenSSH per-connection server daemon (139.178.68.195:59346).
Oct 9 02:55:40.465597 sshd[7326]: Accepted publickey for core from 139.178.68.195 port 59346 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:40.467420 sshd[7326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:40.472769 systemd-logind[1493]: New session 44 of user core.
Oct 9 02:55:40.477562 systemd[1]: Started session-44.scope - Session 44 of User core.
Oct 9 02:55:41.315001 sshd[7326]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:41.318915 systemd-logind[1493]: Session 44 logged out. Waiting for processes to exit.
Oct 9 02:55:41.319872 systemd[1]: sshd@45-188.245.48.63:22-139.178.68.195:59346.service: Deactivated successfully.
Oct 9 02:55:41.322218 systemd[1]: session-44.scope: Deactivated successfully.
Oct 9 02:55:41.323487 systemd-logind[1493]: Removed session 44.
Oct 9 02:55:41.490648 systemd[1]: Started sshd@46-188.245.48.63:22-139.178.68.195:51134.service - OpenSSH per-connection server daemon (139.178.68.195:51134).
Oct 9 02:55:42.504301 sshd[7337]: Accepted publickey for core from 139.178.68.195 port 51134 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:42.506050 sshd[7337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:42.510864 systemd-logind[1493]: New session 45 of user core.
Oct 9 02:55:42.516602 systemd[1]: Started session-45.scope - Session 45 of User core.
Oct 9 02:55:43.279663 sshd[7337]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:43.284989 systemd[1]: sshd@46-188.245.48.63:22-139.178.68.195:51134.service: Deactivated successfully.
Oct 9 02:55:43.286958 systemd[1]: session-45.scope: Deactivated successfully.
Oct 9 02:55:43.288326 systemd-logind[1493]: Session 45 logged out. Waiting for processes to exit.
Oct 9 02:55:43.289989 systemd-logind[1493]: Removed session 45.
Oct 9 02:55:43.536251 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.fEUV0L.mount: Deactivated successfully.
Oct 9 02:55:48.461916 systemd[1]: Started sshd@47-188.245.48.63:22-139.178.68.195:51136.service - OpenSSH per-connection server daemon (139.178.68.195:51136).
Oct 9 02:55:49.512736 sshd[7405]: Accepted publickey for core from 139.178.68.195 port 51136 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:49.518500 sshd[7405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:49.526652 systemd-logind[1493]: New session 46 of user core.
Oct 9 02:55:49.535670 systemd[1]: Started session-46.scope - Session 46 of User core.
Oct 9 02:55:50.523993 sshd[7405]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:50.530799 systemd[1]: sshd@47-188.245.48.63:22-139.178.68.195:51136.service: Deactivated successfully.
Oct 9 02:55:50.534778 systemd[1]: session-46.scope: Deactivated successfully.
Oct 9 02:55:50.535982 systemd-logind[1493]: Session 46 logged out. Waiting for processes to exit.
Oct 9 02:55:50.537740 systemd-logind[1493]: Removed session 46.
Oct 9 02:55:55.709732 systemd[1]: Started sshd@48-188.245.48.63:22-139.178.68.195:45994.service - OpenSSH per-connection server daemon (139.178.68.195:45994).
Oct 9 02:55:56.728572 sshd[7422]: Accepted publickey for core from 139.178.68.195 port 45994 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:55:56.731501 sshd[7422]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:55:56.737145 systemd-logind[1493]: New session 47 of user core.
Oct 9 02:55:56.742602 systemd[1]: Started session-47.scope - Session 47 of User core.
Oct 9 02:55:57.505662 sshd[7422]: pam_unix(sshd:session): session closed for user core
Oct 9 02:55:57.509822 systemd[1]: sshd@48-188.245.48.63:22-139.178.68.195:45994.service: Deactivated successfully.
Oct 9 02:55:57.512185 systemd[1]: session-47.scope: Deactivated successfully.
Oct 9 02:55:57.513268 systemd-logind[1493]: Session 47 logged out. Waiting for processes to exit.
Oct 9 02:55:57.515017 systemd-logind[1493]: Removed session 47.
Oct 9 02:56:02.681462 systemd[1]: Started sshd@49-188.245.48.63:22-139.178.68.195:33832.service - OpenSSH per-connection server daemon (139.178.68.195:33832).
Oct 9 02:56:03.687284 sshd[7437]: Accepted publickey for core from 139.178.68.195 port 33832 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:03.689182 sshd[7437]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:03.694410 systemd-logind[1493]: New session 48 of user core.
Oct 9 02:56:03.698589 systemd[1]: Started session-48.scope - Session 48 of User core.
Oct 9 02:56:04.436867 sshd[7437]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:04.441238 systemd[1]: sshd@49-188.245.48.63:22-139.178.68.195:33832.service: Deactivated successfully.
Oct 9 02:56:04.443988 systemd[1]: session-48.scope: Deactivated successfully.
Oct 9 02:56:04.444783 systemd-logind[1493]: Session 48 logged out. Waiting for processes to exit.
Oct 9 02:56:04.445902 systemd-logind[1493]: Removed session 48.
Oct 9 02:56:09.613938 systemd[1]: Started sshd@50-188.245.48.63:22-139.178.68.195:33846.service - OpenSSH per-connection server daemon (139.178.68.195:33846).
Oct 9 02:56:10.601719 sshd[7455]: Accepted publickey for core from 139.178.68.195 port 33846 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:10.603700 sshd[7455]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:10.608619 systemd-logind[1493]: New session 49 of user core.
Oct 9 02:56:10.613639 systemd[1]: Started session-49.scope - Session 49 of User core.
Oct 9 02:56:11.398004 sshd[7455]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:11.403304 systemd[1]: sshd@50-188.245.48.63:22-139.178.68.195:33846.service: Deactivated successfully.
Oct 9 02:56:11.407768 systemd[1]: session-49.scope: Deactivated successfully.
Oct 9 02:56:11.410561 systemd-logind[1493]: Session 49 logged out. Waiting for processes to exit.
Oct 9 02:56:11.413411 systemd-logind[1493]: Removed session 49.
Oct 9 02:56:16.574375 systemd[1]: Started sshd@51-188.245.48.63:22-139.178.68.195:51268.service - OpenSSH per-connection server daemon (139.178.68.195:51268).
Oct 9 02:56:17.620501 sshd[7495]: Accepted publickey for core from 139.178.68.195 port 51268 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:17.626831 sshd[7495]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:17.631818 systemd-logind[1493]: New session 50 of user core.
Oct 9 02:56:17.635540 systemd[1]: Started session-50.scope - Session 50 of User core.
Oct 9 02:56:18.474012 sshd[7495]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:18.479528 systemd[1]: sshd@51-188.245.48.63:22-139.178.68.195:51268.service: Deactivated successfully.
Oct 9 02:56:18.484879 systemd[1]: session-50.scope: Deactivated successfully.
Oct 9 02:56:18.488239 systemd-logind[1493]: Session 50 logged out. Waiting for processes to exit.
Oct 9 02:56:18.491108 systemd-logind[1493]: Removed session 50.
Oct 9 02:56:23.649780 systemd[1]: Started sshd@52-188.245.48.63:22-139.178.68.195:36804.service - OpenSSH per-connection server daemon (139.178.68.195:36804).
Oct 9 02:56:24.664253 sshd[7530]: Accepted publickey for core from 139.178.68.195 port 36804 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:24.666925 sshd[7530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:24.673227 systemd-logind[1493]: New session 51 of user core.
Oct 9 02:56:24.678626 systemd[1]: Started session-51.scope - Session 51 of User core.
Oct 9 02:56:25.424766 sshd[7530]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:25.429508 systemd[1]: sshd@52-188.245.48.63:22-139.178.68.195:36804.service: Deactivated successfully.
Oct 9 02:56:25.433033 systemd[1]: session-51.scope: Deactivated successfully.
Oct 9 02:56:25.433868 systemd-logind[1493]: Session 51 logged out. Waiting for processes to exit.
Oct 9 02:56:25.435116 systemd-logind[1493]: Removed session 51.
Oct 9 02:56:30.607731 systemd[1]: Started sshd@53-188.245.48.63:22-139.178.68.195:36810.service - OpenSSH per-connection server daemon (139.178.68.195:36810).
Oct 9 02:56:31.627269 sshd[7550]: Accepted publickey for core from 139.178.68.195 port 36810 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:31.629342 sshd[7550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:31.635215 systemd-logind[1493]: New session 52 of user core.
Oct 9 02:56:31.640624 systemd[1]: Started session-52.scope - Session 52 of User core.
Oct 9 02:56:32.451254 sshd[7550]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:32.454485 systemd[1]: sshd@53-188.245.48.63:22-139.178.68.195:36810.service: Deactivated successfully.
Oct 9 02:56:32.456835 systemd[1]: session-52.scope: Deactivated successfully.
Oct 9 02:56:32.458752 systemd-logind[1493]: Session 52 logged out. Waiting for processes to exit.
Oct 9 02:56:32.460257 systemd-logind[1493]: Removed session 52.
Oct 9 02:56:35.369087 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.WmgYwg.mount: Deactivated successfully.
Oct 9 02:56:37.631719 systemd[1]: Started sshd@54-188.245.48.63:22-139.178.68.195:44860.service - OpenSSH per-connection server daemon (139.178.68.195:44860).
Oct 9 02:56:38.648171 sshd[7588]: Accepted publickey for core from 139.178.68.195 port 44860 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:38.651687 sshd[7588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:38.656723 systemd-logind[1493]: New session 53 of user core.
Oct 9 02:56:38.665560 systemd[1]: Started session-53.scope - Session 53 of User core.
Oct 9 02:56:39.506505 sshd[7588]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:39.510779 systemd[1]: sshd@54-188.245.48.63:22-139.178.68.195:44860.service: Deactivated successfully.
Oct 9 02:56:39.513357 systemd[1]: session-53.scope: Deactivated successfully.
Oct 9 02:56:39.515757 systemd-logind[1493]: Session 53 logged out. Waiting for processes to exit.
Oct 9 02:56:39.517373 systemd-logind[1493]: Removed session 53.
Oct 9 02:56:43.538859 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.sro1HS.mount: Deactivated successfully.
Oct 9 02:56:44.682687 systemd[1]: Started sshd@55-188.245.48.63:22-139.178.68.195:50912.service - OpenSSH per-connection server daemon (139.178.68.195:50912).
Oct 9 02:56:45.675709 sshd[7622]: Accepted publickey for core from 139.178.68.195 port 50912 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:45.677473 sshd[7622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:45.681987 systemd-logind[1493]: New session 54 of user core.
Oct 9 02:56:45.693641 systemd[1]: Started session-54.scope - Session 54 of User core.
Oct 9 02:56:46.443962 sshd[7622]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:46.448818 systemd[1]: sshd@55-188.245.48.63:22-139.178.68.195:50912.service: Deactivated successfully.
Oct 9 02:56:46.451782 systemd[1]: session-54.scope: Deactivated successfully.
Oct 9 02:56:46.452719 systemd-logind[1493]: Session 54 logged out. Waiting for processes to exit.
Oct 9 02:56:46.454150 systemd-logind[1493]: Removed session 54.
Oct 9 02:56:51.621948 systemd[1]: Started sshd@56-188.245.48.63:22-139.178.68.195:55522.service - OpenSSH per-connection server daemon (139.178.68.195:55522).
Oct 9 02:56:52.636655 sshd[7676]: Accepted publickey for core from 139.178.68.195 port 55522 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:52.638707 sshd[7676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:52.643880 systemd-logind[1493]: New session 55 of user core.
Oct 9 02:56:52.651565 systemd[1]: Started session-55.scope - Session 55 of User core.
Oct 9 02:56:53.408158 sshd[7676]: pam_unix(sshd:session): session closed for user core
Oct 9 02:56:53.412209 systemd[1]: sshd@56-188.245.48.63:22-139.178.68.195:55522.service: Deactivated successfully.
Oct 9 02:56:53.414625 systemd[1]: session-55.scope: Deactivated successfully.
Oct 9 02:56:53.415417 systemd-logind[1493]: Session 55 logged out. Waiting for processes to exit.
Oct 9 02:56:53.416633 systemd-logind[1493]: Removed session 55.
Oct 9 02:56:58.579554 systemd[1]: Started sshd@57-188.245.48.63:22-139.178.68.195:55534.service - OpenSSH per-connection server daemon (139.178.68.195:55534).
Oct 9 02:56:59.597008 sshd[7696]: Accepted publickey for core from 139.178.68.195 port 55534 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:56:59.600705 sshd[7696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:56:59.605511 systemd-logind[1493]: New session 56 of user core.
Oct 9 02:56:59.611573 systemd[1]: Started session-56.scope - Session 56 of User core.
Oct 9 02:57:00.589269 sshd[7696]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:00.593892 systemd-logind[1493]: Session 56 logged out. Waiting for processes to exit.
Oct 9 02:57:00.595047 systemd[1]: sshd@57-188.245.48.63:22-139.178.68.195:55534.service: Deactivated successfully.
Oct 9 02:57:00.598217 systemd[1]: session-56.scope: Deactivated successfully.
Oct 9 02:57:00.600233 systemd-logind[1493]: Removed session 56.
Oct 9 02:57:05.768265 systemd[1]: Started sshd@58-188.245.48.63:22-139.178.68.195:58192.service - OpenSSH per-connection server daemon (139.178.68.195:58192).
Oct 9 02:57:06.758681 sshd[7709]: Accepted publickey for core from 139.178.68.195 port 58192 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:06.760294 sshd[7709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:06.764913 systemd-logind[1493]: New session 57 of user core.
Oct 9 02:57:06.769569 systemd[1]: Started session-57.scope - Session 57 of User core.
Oct 9 02:57:07.517795 sshd[7709]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:07.520910 systemd[1]: sshd@58-188.245.48.63:22-139.178.68.195:58192.service: Deactivated successfully.
Oct 9 02:57:07.522997 systemd[1]: session-57.scope: Deactivated successfully.
Oct 9 02:57:07.524919 systemd-logind[1493]: Session 57 logged out. Waiting for processes to exit.
Oct 9 02:57:07.526192 systemd-logind[1493]: Removed session 57.
Oct 9 02:57:12.690150 systemd[1]: Started sshd@59-188.245.48.63:22-139.178.68.195:53712.service - OpenSSH per-connection server daemon (139.178.68.195:53712).
Oct 9 02:57:13.692248 sshd[7728]: Accepted publickey for core from 139.178.68.195 port 53712 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:13.694044 sshd[7728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:13.698761 systemd-logind[1493]: New session 58 of user core.
Oct 9 02:57:13.704566 systemd[1]: Started session-58.scope - Session 58 of User core.
Oct 9 02:57:14.494013 sshd[7728]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:14.498323 systemd[1]: sshd@59-188.245.48.63:22-139.178.68.195:53712.service: Deactivated successfully.
Oct 9 02:57:14.500745 systemd[1]: session-58.scope: Deactivated successfully.
Oct 9 02:57:14.501861 systemd-logind[1493]: Session 58 logged out. Waiting for processes to exit.
Oct 9 02:57:14.503040 systemd-logind[1493]: Removed session 58.
Oct 9 02:57:17.248158 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.gKjg9a.mount: Deactivated successfully.
Oct 9 02:57:19.673774 systemd[1]: Started sshd@60-188.245.48.63:22-139.178.68.195:53722.service - OpenSSH per-connection server daemon (139.178.68.195:53722).
Oct 9 02:57:20.668198 sshd[7786]: Accepted publickey for core from 139.178.68.195 port 53722 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:20.670742 sshd[7786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:20.676002 systemd-logind[1493]: New session 59 of user core.
Oct 9 02:57:20.679572 systemd[1]: Started session-59.scope - Session 59 of User core.
Oct 9 02:57:21.464133 sshd[7786]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:21.468301 systemd-logind[1493]: Session 59 logged out. Waiting for processes to exit.
Oct 9 02:57:21.469126 systemd[1]: sshd@60-188.245.48.63:22-139.178.68.195:53722.service: Deactivated successfully.
Oct 9 02:57:21.471418 systemd[1]: session-59.scope: Deactivated successfully.
Oct 9 02:57:21.472835 systemd-logind[1493]: Removed session 59.
Oct 9 02:57:26.635894 systemd[1]: Started sshd@61-188.245.48.63:22-139.178.68.195:32884.service - OpenSSH per-connection server daemon (139.178.68.195:32884).
Oct 9 02:57:27.634528 sshd[7802]: Accepted publickey for core from 139.178.68.195 port 32884 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:27.636116 sshd[7802]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:27.640997 systemd-logind[1493]: New session 60 of user core.
Oct 9 02:57:27.647936 systemd[1]: Started session-60.scope - Session 60 of User core.
Oct 9 02:57:28.478358 sshd[7802]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:28.483198 systemd[1]: sshd@61-188.245.48.63:22-139.178.68.195:32884.service: Deactivated successfully.
Oct 9 02:57:28.485709 systemd[1]: session-60.scope: Deactivated successfully.
Oct 9 02:57:28.487019 systemd-logind[1493]: Session 60 logged out. Waiting for processes to exit.
Oct 9 02:57:28.488854 systemd-logind[1493]: Removed session 60.
Oct 9 02:57:33.650423 systemd[1]: Started sshd@62-188.245.48.63:22-139.178.68.195:47738.service - OpenSSH per-connection server daemon (139.178.68.195:47738).
Oct 9 02:57:34.663038 sshd[7819]: Accepted publickey for core from 139.178.68.195 port 47738 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:34.664944 sshd[7819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:34.670059 systemd-logind[1493]: New session 61 of user core.
Oct 9 02:57:34.674593 systemd[1]: Started session-61.scope - Session 61 of User core.
Oct 9 02:57:35.364813 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.LV5cIL.mount: Deactivated successfully.
Oct 9 02:57:35.427036 sshd[7819]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:35.431564 systemd[1]: sshd@62-188.245.48.63:22-139.178.68.195:47738.service: Deactivated successfully.
Oct 9 02:57:35.434065 systemd[1]: session-61.scope: Deactivated successfully.
Oct 9 02:57:35.434824 systemd-logind[1493]: Session 61 logged out. Waiting for processes to exit.
Oct 9 02:57:35.435797 systemd-logind[1493]: Removed session 61.
Oct 9 02:57:40.597481 systemd[1]: Started sshd@63-188.245.48.63:22-139.178.68.195:47748.service - OpenSSH per-connection server daemon (139.178.68.195:47748).
Oct 9 02:57:41.590839 sshd[7856]: Accepted publickey for core from 139.178.68.195 port 47748 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:41.593771 sshd[7856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:41.602391 systemd-logind[1493]: New session 62 of user core.
Oct 9 02:57:41.605750 systemd[1]: Started session-62.scope - Session 62 of User core.
Oct 9 02:57:42.346129 sshd[7856]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:42.350346 systemd[1]: sshd@63-188.245.48.63:22-139.178.68.195:47748.service: Deactivated successfully.
Oct 9 02:57:42.352955 systemd[1]: session-62.scope: Deactivated successfully.
Oct 9 02:57:42.354847 systemd-logind[1493]: Session 62 logged out. Waiting for processes to exit.
Oct 9 02:57:42.356049 systemd-logind[1493]: Removed session 62.
Oct 9 02:57:47.249109 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.tvfCJG.mount: Deactivated successfully.
Oct 9 02:57:47.525936 systemd[1]: Started sshd@64-188.245.48.63:22-139.178.68.195:36038.service - OpenSSH per-connection server daemon (139.178.68.195:36038).
Oct 9 02:57:48.537616 sshd[7916]: Accepted publickey for core from 139.178.68.195 port 36038 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:48.540707 sshd[7916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:48.550864 systemd-logind[1493]: New session 63 of user core.
Oct 9 02:57:48.557647 systemd[1]: Started session-63.scope - Session 63 of User core.
Oct 9 02:57:49.324153 sshd[7916]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:49.327355 systemd[1]: sshd@64-188.245.48.63:22-139.178.68.195:36038.service: Deactivated successfully.
Oct 9 02:57:49.329657 systemd[1]: session-63.scope: Deactivated successfully.
Oct 9 02:57:49.331487 systemd-logind[1493]: Session 63 logged out. Waiting for processes to exit.
Oct 9 02:57:49.333056 systemd-logind[1493]: Removed session 63.
Oct 9 02:57:54.513166 systemd[1]: Started sshd@65-188.245.48.63:22-139.178.68.195:34390.service - OpenSSH per-connection server daemon (139.178.68.195:34390).
Oct 9 02:57:55.528062 sshd[7935]: Accepted publickey for core from 139.178.68.195 port 34390 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:57:55.530165 sshd[7935]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:57:55.536145 systemd-logind[1493]: New session 64 of user core.
Oct 9 02:57:55.540655 systemd[1]: Started session-64.scope - Session 64 of User core.
Oct 9 02:57:56.291009 sshd[7935]: pam_unix(sshd:session): session closed for user core
Oct 9 02:57:56.294646 systemd[1]: sshd@65-188.245.48.63:22-139.178.68.195:34390.service: Deactivated successfully.
Oct 9 02:57:56.296923 systemd[1]: session-64.scope: Deactivated successfully.
Oct 9 02:57:56.299233 systemd-logind[1493]: Session 64 logged out. Waiting for processes to exit.
Oct 9 02:57:56.300595 systemd-logind[1493]: Removed session 64.
Oct 9 02:58:01.462501 systemd[1]: Started sshd@66-188.245.48.63:22-139.178.68.195:59188.service - OpenSSH per-connection server daemon (139.178.68.195:59188).
Oct 9 02:58:02.470817 sshd[7955]: Accepted publickey for core from 139.178.68.195 port 59188 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:02.472678 sshd[7955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:02.478057 systemd-logind[1493]: New session 65 of user core.
Oct 9 02:58:02.485597 systemd[1]: Started session-65.scope - Session 65 of User core.
Oct 9 02:58:03.378272 sshd[7955]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:03.383234 systemd[1]: sshd@66-188.245.48.63:22-139.178.68.195:59188.service: Deactivated successfully.
Oct 9 02:58:03.385666 systemd[1]: session-65.scope: Deactivated successfully.
Oct 9 02:58:03.386326 systemd-logind[1493]: Session 65 logged out. Waiting for processes to exit.
Oct 9 02:58:03.387756 systemd-logind[1493]: Removed session 65.
Oct 9 02:58:08.551781 systemd[1]: Started sshd@67-188.245.48.63:22-139.178.68.195:59194.service - OpenSSH per-connection server daemon (139.178.68.195:59194).
Oct 9 02:58:09.549590 sshd[7974]: Accepted publickey for core from 139.178.68.195 port 59194 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:09.551902 sshd[7974]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:09.557127 systemd-logind[1493]: New session 66 of user core.
Oct 9 02:58:09.562562 systemd[1]: Started session-66.scope - Session 66 of User core.
Oct 9 02:58:10.340619 sshd[7974]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:10.344662 systemd[1]: sshd@67-188.245.48.63:22-139.178.68.195:59194.service: Deactivated successfully.
Oct 9 02:58:10.347088 systemd[1]: session-66.scope: Deactivated successfully.
Oct 9 02:58:10.348365 systemd-logind[1493]: Session 66 logged out. Waiting for processes to exit.
Oct 9 02:58:10.350084 systemd-logind[1493]: Removed session 66.
Oct 9 02:58:15.512876 systemd[1]: Started sshd@68-188.245.48.63:22-139.178.68.195:37762.service - OpenSSH per-connection server daemon (139.178.68.195:37762).
Oct 9 02:58:16.528543 sshd[8012]: Accepted publickey for core from 139.178.68.195 port 37762 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:16.531542 sshd[8012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:16.539544 systemd-logind[1493]: New session 67 of user core.
Oct 9 02:58:16.546666 systemd[1]: Started session-67.scope - Session 67 of User core.
Oct 9 02:58:17.340563 sshd[8012]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:17.347515 systemd[1]: sshd@68-188.245.48.63:22-139.178.68.195:37762.service: Deactivated successfully.
Oct 9 02:58:17.349652 systemd[1]: session-67.scope: Deactivated successfully.
Oct 9 02:58:17.351393 systemd-logind[1493]: Session 67 logged out. Waiting for processes to exit.
Oct 9 02:58:17.352723 systemd-logind[1493]: Removed session 67.
Oct 9 02:58:22.517244 systemd[1]: Started sshd@69-188.245.48.63:22-139.178.68.195:54442.service - OpenSSH per-connection server daemon (139.178.68.195:54442).
Oct 9 02:58:23.522493 sshd[8052]: Accepted publickey for core from 139.178.68.195 port 54442 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:23.524995 sshd[8052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:23.529691 systemd-logind[1493]: New session 68 of user core.
Oct 9 02:58:23.535561 systemd[1]: Started session-68.scope - Session 68 of User core.
Oct 9 02:58:24.439901 sshd[8052]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:24.445828 systemd-logind[1493]: Session 68 logged out. Waiting for processes to exit.
Oct 9 02:58:24.446071 systemd[1]: sshd@69-188.245.48.63:22-139.178.68.195:54442.service: Deactivated successfully.
Oct 9 02:58:24.448811 systemd[1]: session-68.scope: Deactivated successfully.
Oct 9 02:58:24.450379 systemd-logind[1493]: Removed session 68.
Oct 9 02:58:29.621769 systemd[1]: Started sshd@70-188.245.48.63:22-139.178.68.195:54446.service - OpenSSH per-connection server daemon (139.178.68.195:54446).
Oct 9 02:58:30.663793 sshd[8068]: Accepted publickey for core from 139.178.68.195 port 54446 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:30.666110 sshd[8068]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:30.671374 systemd-logind[1493]: New session 69 of user core.
Oct 9 02:58:30.677621 systemd[1]: Started session-69.scope - Session 69 of User core.
Oct 9 02:58:31.508991 sshd[8068]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:31.515788 systemd-logind[1493]: Session 69 logged out. Waiting for processes to exit.
Oct 9 02:58:31.516721 systemd[1]: sshd@70-188.245.48.63:22-139.178.68.195:54446.service: Deactivated successfully.
Oct 9 02:58:31.519369 systemd[1]: session-69.scope: Deactivated successfully.
Oct 9 02:58:31.520748 systemd-logind[1493]: Removed session 69.
Oct 9 02:58:35.363899 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.vkVhNU.mount: Deactivated successfully.
Oct 9 02:58:36.683737 systemd[1]: Started sshd@71-188.245.48.63:22-139.178.68.195:40032.service - OpenSSH per-connection server daemon (139.178.68.195:40032).
Oct 9 02:58:37.703941 sshd[8119]: Accepted publickey for core from 139.178.68.195 port 40032 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:37.706481 sshd[8119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:37.711495 systemd-logind[1493]: New session 70 of user core.
Oct 9 02:58:37.716606 systemd[1]: Started session-70.scope - Session 70 of User core.
Oct 9 02:58:38.511635 sshd[8119]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:38.514826 systemd[1]: sshd@71-188.245.48.63:22-139.178.68.195:40032.service: Deactivated successfully.
Oct 9 02:58:38.516964 systemd[1]: session-70.scope: Deactivated successfully.
Oct 9 02:58:38.519124 systemd-logind[1493]: Session 70 logged out. Waiting for processes to exit.
Oct 9 02:58:38.520330 systemd-logind[1493]: Removed session 70.
Oct 9 02:58:43.683717 systemd[1]: Started sshd@72-188.245.48.63:22-139.178.68.195:46570.service - OpenSSH per-connection server daemon (139.178.68.195:46570).
Oct 9 02:58:44.683095 sshd[8158]: Accepted publickey for core from 139.178.68.195 port 46570 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:44.684885 sshd[8158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:44.689496 systemd-logind[1493]: New session 71 of user core.
Oct 9 02:58:44.697642 systemd[1]: Started session-71.scope - Session 71 of User core.
Oct 9 02:58:45.455816 sshd[8158]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:45.459878 systemd-logind[1493]: Session 71 logged out. Waiting for processes to exit.
Oct 9 02:58:45.460828 systemd[1]: sshd@72-188.245.48.63:22-139.178.68.195:46570.service: Deactivated successfully.
Oct 9 02:58:45.463234 systemd[1]: session-71.scope: Deactivated successfully.
Oct 9 02:58:45.464468 systemd-logind[1493]: Removed session 71.
Oct 9 02:58:50.634793 systemd[1]: Started sshd@73-188.245.48.63:22-139.178.68.195:46582.service - OpenSSH per-connection server daemon (139.178.68.195:46582).
Oct 9 02:58:51.625018 sshd[8194]: Accepted publickey for core from 139.178.68.195 port 46582 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:51.626743 sshd[8194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:51.631175 systemd-logind[1493]: New session 72 of user core.
Oct 9 02:58:51.635618 systemd[1]: Started session-72.scope - Session 72 of User core.
Oct 9 02:58:52.383338 sshd[8194]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:52.387393 systemd[1]: sshd@73-188.245.48.63:22-139.178.68.195:46582.service: Deactivated successfully.
Oct 9 02:58:52.390727 systemd[1]: session-72.scope: Deactivated successfully.
Oct 9 02:58:52.393164 systemd-logind[1493]: Session 72 logged out. Waiting for processes to exit.
Oct 9 02:58:52.395159 systemd-logind[1493]: Removed session 72.
Oct 9 02:58:57.557689 systemd[1]: Started sshd@74-188.245.48.63:22-139.178.68.195:42416.service - OpenSSH per-connection server daemon (139.178.68.195:42416).
Oct 9 02:58:57.565671 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
Oct 9 02:58:57.599241 systemd-tmpfiles[8215]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Oct 9 02:58:57.600259 systemd-tmpfiles[8215]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Oct 9 02:58:57.601338 systemd-tmpfiles[8215]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Oct 9 02:58:57.601612 systemd-tmpfiles[8215]: ACLs are not supported, ignoring.
Oct 9 02:58:57.601681 systemd-tmpfiles[8215]: ACLs are not supported, ignoring.
Oct 9 02:58:57.605534 systemd-tmpfiles[8215]: Detected autofs mount point /boot during canonicalization of boot.
Oct 9 02:58:57.605544 systemd-tmpfiles[8215]: Skipping /boot
Oct 9 02:58:57.614369 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 9 02:58:57.614674 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
Oct 9 02:58:58.555000 sshd[8214]: Accepted publickey for core from 139.178.68.195 port 42416 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:58:58.556613 sshd[8214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:58:58.560516 systemd-logind[1493]: New session 73 of user core.
Oct 9 02:58:58.565555 systemd[1]: Started session-73.scope - Session 73 of User core.
Oct 9 02:58:59.290339 sshd[8214]: pam_unix(sshd:session): session closed for user core
Oct 9 02:58:59.294614 systemd-logind[1493]: Session 73 logged out. Waiting for processes to exit.
Oct 9 02:58:59.295316 systemd[1]: sshd@74-188.245.48.63:22-139.178.68.195:42416.service: Deactivated successfully.
Oct 9 02:58:59.297758 systemd[1]: session-73.scope: Deactivated successfully.
Oct 9 02:58:59.299178 systemd-logind[1493]: Removed session 73.
Oct 9 02:59:04.467687 systemd[1]: Started sshd@75-188.245.48.63:22-139.178.68.195:41732.service - OpenSSH per-connection server daemon (139.178.68.195:41732).
Oct 9 02:59:05.461084 sshd[8235]: Accepted publickey for core from 139.178.68.195 port 41732 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:05.462888 sshd[8235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:05.467173 systemd-logind[1493]: New session 74 of user core.
Oct 9 02:59:05.472568 systemd[1]: Started session-74.scope - Session 74 of User core.
Oct 9 02:59:06.204532 sshd[8235]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:06.208126 systemd[1]: sshd@75-188.245.48.63:22-139.178.68.195:41732.service: Deactivated successfully.
Oct 9 02:59:06.210555 systemd[1]: session-74.scope: Deactivated successfully.
Oct 9 02:59:06.212951 systemd-logind[1493]: Session 74 logged out. Waiting for processes to exit.
Oct 9 02:59:06.215158 systemd-logind[1493]: Removed session 74.
Oct 9 02:59:11.379691 systemd[1]: Started sshd@76-188.245.48.63:22-139.178.68.195:40708.service - OpenSSH per-connection server daemon (139.178.68.195:40708).
Oct 9 02:59:12.372363 sshd[8248]: Accepted publickey for core from 139.178.68.195 port 40708 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:12.374354 sshd[8248]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:12.383196 systemd-logind[1493]: New session 75 of user core.
Oct 9 02:59:12.387810 systemd[1]: Started session-75.scope - Session 75 of User core.
Oct 9 02:59:13.128973 sshd[8248]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:13.134380 systemd[1]: sshd@76-188.245.48.63:22-139.178.68.195:40708.service: Deactivated successfully.
Oct 9 02:59:13.136303 systemd[1]: session-75.scope: Deactivated successfully.
Oct 9 02:59:13.137223 systemd-logind[1493]: Session 75 logged out. Waiting for processes to exit.
Oct 9 02:59:13.138379 systemd-logind[1493]: Removed session 75.
Oct 9 02:59:18.313640 systemd[1]: Started sshd@77-188.245.48.63:22-139.178.68.195:40720.service - OpenSSH per-connection server daemon (139.178.68.195:40720).
Oct 9 02:59:19.340484 sshd[8308]: Accepted publickey for core from 139.178.68.195 port 40720 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:19.346115 sshd[8308]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:19.359944 systemd-logind[1493]: New session 76 of user core.
Oct 9 02:59:19.364729 systemd[1]: Started session-76.scope - Session 76 of User core.
Oct 9 02:59:20.215330 sshd[8308]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:20.218079 systemd[1]: sshd@77-188.245.48.63:22-139.178.68.195:40720.service: Deactivated successfully.
Oct 9 02:59:20.220101 systemd[1]: session-76.scope: Deactivated successfully.
Oct 9 02:59:20.221385 systemd-logind[1493]: Session 76 logged out. Waiting for processes to exit.
Oct 9 02:59:20.223056 systemd-logind[1493]: Removed session 76.
Oct 9 02:59:25.388721 systemd[1]: Started sshd@78-188.245.48.63:22-139.178.68.195:60590.service - OpenSSH per-connection server daemon (139.178.68.195:60590).
Oct 9 02:59:26.392850 sshd[8327]: Accepted publickey for core from 139.178.68.195 port 60590 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:26.394730 sshd[8327]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:26.399830 systemd-logind[1493]: New session 77 of user core.
Oct 9 02:59:26.404610 systemd[1]: Started session-77.scope - Session 77 of User core.
Oct 9 02:59:27.168340 sshd[8327]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:27.173384 systemd-logind[1493]: Session 77 logged out. Waiting for processes to exit.
Oct 9 02:59:27.174230 systemd[1]: sshd@78-188.245.48.63:22-139.178.68.195:60590.service: Deactivated successfully.
Oct 9 02:59:27.176403 systemd[1]: session-77.scope: Deactivated successfully.
Oct 9 02:59:27.177699 systemd-logind[1493]: Removed session 77.
Oct 9 02:59:32.347872 systemd[1]: Started sshd@79-188.245.48.63:22-139.178.68.195:43820.service - OpenSSH per-connection server daemon (139.178.68.195:43820).
Oct 9 02:59:33.355634 sshd[8342]: Accepted publickey for core from 139.178.68.195 port 43820 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:33.357263 sshd[8342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:33.361964 systemd-logind[1493]: New session 78 of user core.
Oct 9 02:59:33.368827 systemd[1]: Started session-78.scope - Session 78 of User core.
Oct 9 02:59:34.110363 sshd[8342]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:34.113834 systemd-logind[1493]: Session 78 logged out. Waiting for processes to exit.
Oct 9 02:59:34.114711 systemd[1]: sshd@79-188.245.48.63:22-139.178.68.195:43820.service: Deactivated successfully.
Oct 9 02:59:34.116542 systemd[1]: session-78.scope: Deactivated successfully.
Oct 9 02:59:34.118062 systemd-logind[1493]: Removed session 78.
Oct 9 02:59:39.288806 systemd[1]: Started sshd@80-188.245.48.63:22-139.178.68.195:43834.service - OpenSSH per-connection server daemon (139.178.68.195:43834).
Oct 9 02:59:40.332556 sshd[8379]: Accepted publickey for core from 139.178.68.195 port 43834 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:40.338592 sshd[8379]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:40.347280 systemd-logind[1493]: New session 79 of user core.
Oct 9 02:59:40.359641 systemd[1]: Started session-79.scope - Session 79 of User core.
Oct 9 02:59:41.371049 sshd[8379]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:41.375117 systemd[1]: sshd@80-188.245.48.63:22-139.178.68.195:43834.service: Deactivated successfully.
Oct 9 02:59:41.377878 systemd[1]: session-79.scope: Deactivated successfully.
Oct 9 02:59:41.378629 systemd-logind[1493]: Session 79 logged out. Waiting for processes to exit.
Oct 9 02:59:41.379612 systemd-logind[1493]: Removed session 79.
Oct 9 02:59:46.548808 systemd[1]: Started sshd@81-188.245.48.63:22-139.178.68.195:53996.service - OpenSSH per-connection server daemon (139.178.68.195:53996).
Oct 9 02:59:47.551113 sshd[8418]: Accepted publickey for core from 139.178.68.195 port 53996 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:47.553017 sshd[8418]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:47.557965 systemd-logind[1493]: New session 80 of user core.
Oct 9 02:59:47.562576 systemd[1]: Started session-80.scope - Session 80 of User core.
Oct 9 02:59:48.328613 sshd[8418]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:48.333896 systemd[1]: sshd@81-188.245.48.63:22-139.178.68.195:53996.service: Deactivated successfully.
Oct 9 02:59:48.337123 systemd[1]: session-80.scope: Deactivated successfully.
Oct 9 02:59:48.338111 systemd-logind[1493]: Session 80 logged out. Waiting for processes to exit.
Oct 9 02:59:48.339605 systemd-logind[1493]: Removed session 80.
Oct 9 02:59:48.503024 systemd[1]: Started sshd@82-188.245.48.63:22-139.178.68.195:54010.service - OpenSSH per-connection server daemon (139.178.68.195:54010).
Oct 9 02:59:49.507842 sshd[8454]: Accepted publickey for core from 139.178.68.195 port 54010 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:49.509557 sshd[8454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:49.514806 systemd-logind[1493]: New session 81 of user core.
Oct 9 02:59:49.520600 systemd[1]: Started session-81.scope - Session 81 of User core.
Oct 9 02:59:50.556483 sshd[8454]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:50.560506 systemd[1]: sshd@82-188.245.48.63:22-139.178.68.195:54010.service: Deactivated successfully.
Oct 9 02:59:50.563021 systemd[1]: session-81.scope: Deactivated successfully.
Oct 9 02:59:50.563906 systemd-logind[1493]: Session 81 logged out. Waiting for processes to exit.
Oct 9 02:59:50.565147 systemd-logind[1493]: Removed session 81.
Oct 9 02:59:50.738906 systemd[1]: Started sshd@83-188.245.48.63:22-139.178.68.195:54026.service - OpenSSH per-connection server daemon (139.178.68.195:54026).
Oct 9 02:59:51.784573 sshd[8465]: Accepted publickey for core from 139.178.68.195 port 54026 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:51.786381 sshd[8465]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:51.791525 systemd-logind[1493]: New session 82 of user core.
Oct 9 02:59:51.797596 systemd[1]: Started session-82.scope - Session 82 of User core.
Oct 9 02:59:54.285822 sshd[8465]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:54.293709 systemd[1]: sshd@83-188.245.48.63:22-139.178.68.195:54026.service: Deactivated successfully.
Oct 9 02:59:54.296062 systemd[1]: session-82.scope: Deactivated successfully.
Oct 9 02:59:54.298599 systemd-logind[1493]: Session 82 logged out. Waiting for processes to exit.
Oct 9 02:59:54.300752 systemd-logind[1493]: Removed session 82.
Oct 9 02:59:54.454656 systemd[1]: Started sshd@84-188.245.48.63:22-139.178.68.195:48272.service - OpenSSH per-connection server daemon (139.178.68.195:48272).
Oct 9 02:59:55.467220 sshd[8486]: Accepted publickey for core from 139.178.68.195 port 48272 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:55.470412 sshd[8486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:55.475380 systemd-logind[1493]: New session 83 of user core.
Oct 9 02:59:55.482597 systemd[1]: Started session-83.scope - Session 83 of User core.
Oct 9 02:59:56.656688 sshd[8486]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:56.660672 systemd[1]: sshd@84-188.245.48.63:22-139.178.68.195:48272.service: Deactivated successfully.
Oct 9 02:59:56.662817 systemd[1]: session-83.scope: Deactivated successfully.
Oct 9 02:59:56.663727 systemd-logind[1493]: Session 83 logged out. Waiting for processes to exit.
Oct 9 02:59:56.664854 systemd-logind[1493]: Removed session 83.
Oct 9 02:59:56.828718 systemd[1]: Started sshd@85-188.245.48.63:22-139.178.68.195:48280.service - OpenSSH per-connection server daemon (139.178.68.195:48280).
Oct 9 02:59:57.824074 sshd[8504]: Accepted publickey for core from 139.178.68.195 port 48280 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 02:59:57.826417 sshd[8504]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 02:59:57.832561 systemd-logind[1493]: New session 84 of user core.
Oct 9 02:59:57.840699 systemd[1]: Started session-84.scope - Session 84 of User core.
Oct 9 02:59:58.581640 sshd[8504]: pam_unix(sshd:session): session closed for user core
Oct 9 02:59:58.585577 systemd-logind[1493]: Session 84 logged out. Waiting for processes to exit.
Oct 9 02:59:58.586372 systemd[1]: sshd@85-188.245.48.63:22-139.178.68.195:48280.service: Deactivated successfully.
Oct 9 02:59:58.589083 systemd[1]: session-84.scope: Deactivated successfully.
Oct 9 02:59:58.590170 systemd-logind[1493]: Removed session 84.
Oct 9 03:00:03.757714 systemd[1]: Started sshd@86-188.245.48.63:22-139.178.68.195:53040.service - OpenSSH per-connection server daemon (139.178.68.195:53040).
Oct 9 03:00:04.745460 sshd[8517]: Accepted publickey for core from 139.178.68.195 port 53040 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:04.747221 sshd[8517]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:04.752230 systemd-logind[1493]: New session 85 of user core.
Oct 9 03:00:04.758598 systemd[1]: Started session-85.scope - Session 85 of User core.
Oct 9 03:00:05.500395 sshd[8517]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:05.504727 systemd[1]: sshd@86-188.245.48.63:22-139.178.68.195:53040.service: Deactivated successfully.
Oct 9 03:00:05.507370 systemd[1]: session-85.scope: Deactivated successfully.
Oct 9 03:00:05.508240 systemd-logind[1493]: Session 85 logged out. Waiting for processes to exit.
Oct 9 03:00:05.509398 systemd-logind[1493]: Removed session 85.
Oct 9 03:00:10.671567 systemd[1]: Started sshd@87-188.245.48.63:22-139.178.68.195:53056.service - OpenSSH per-connection server daemon (139.178.68.195:53056).
Oct 9 03:00:11.667392 sshd[8547]: Accepted publickey for core from 139.178.68.195 port 53056 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:11.669135 sshd[8547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:11.673877 systemd-logind[1493]: New session 86 of user core.
Oct 9 03:00:11.678570 systemd[1]: Started session-86.scope - Session 86 of User core.
Oct 9 03:00:12.488355 sshd[8547]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:12.492236 systemd-logind[1493]: Session 86 logged out. Waiting for processes to exit.
Oct 9 03:00:12.493262 systemd[1]: sshd@87-188.245.48.63:22-139.178.68.195:53056.service: Deactivated successfully.
Oct 9 03:00:12.496014 systemd[1]: session-86.scope: Deactivated successfully.
Oct 9 03:00:12.497211 systemd-logind[1493]: Removed session 86.
Oct 9 03:00:13.533070 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.dUURna.mount: Deactivated successfully.
Oct 9 03:00:17.667769 systemd[1]: Started sshd@88-188.245.48.63:22-139.178.68.195:52252.service - OpenSSH per-connection server daemon (139.178.68.195:52252).
Oct 9 03:00:18.680865 sshd[8606]: Accepted publickey for core from 139.178.68.195 port 52252 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:18.683475 sshd[8606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:18.688378 systemd-logind[1493]: New session 87 of user core.
Oct 9 03:00:18.697585 systemd[1]: Started session-87.scope - Session 87 of User core.
Oct 9 03:00:19.432419 sshd[8606]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:19.436609 systemd-logind[1493]: Session 87 logged out. Waiting for processes to exit.
Oct 9 03:00:19.437765 systemd[1]: sshd@88-188.245.48.63:22-139.178.68.195:52252.service: Deactivated successfully.
Oct 9 03:00:19.439954 systemd[1]: session-87.scope: Deactivated successfully.
Oct 9 03:00:19.440996 systemd-logind[1493]: Removed session 87.
Oct 9 03:00:24.605622 systemd[1]: Started sshd@89-188.245.48.63:22-139.178.68.195:37532.service - OpenSSH per-connection server daemon (139.178.68.195:37532).
Oct 9 03:00:25.604586 sshd[8619]: Accepted publickey for core from 139.178.68.195 port 37532 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:25.606581 sshd[8619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:25.611684 systemd-logind[1493]: New session 88 of user core.
Oct 9 03:00:25.616642 systemd[1]: Started session-88.scope - Session 88 of User core.
Oct 9 03:00:26.360944 sshd[8619]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:26.364419 systemd[1]: sshd@89-188.245.48.63:22-139.178.68.195:37532.service: Deactivated successfully.
Oct 9 03:00:26.366887 systemd[1]: session-88.scope: Deactivated successfully.
Oct 9 03:00:26.369120 systemd-logind[1493]: Session 88 logged out. Waiting for processes to exit.
Oct 9 03:00:26.370612 systemd-logind[1493]: Removed session 88.
Oct 9 03:00:31.535693 systemd[1]: Started sshd@90-188.245.48.63:22-139.178.68.195:37726.service - OpenSSH per-connection server daemon (139.178.68.195:37726).
Oct 9 03:00:32.525818 sshd[8639]: Accepted publickey for core from 139.178.68.195 port 37726 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:32.527708 sshd[8639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:32.532295 systemd-logind[1493]: New session 89 of user core.
Oct 9 03:00:32.536596 systemd[1]: Started session-89.scope - Session 89 of User core.
Oct 9 03:00:33.297816 sshd[8639]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:33.306421 systemd[1]: sshd@90-188.245.48.63:22-139.178.68.195:37726.service: Deactivated successfully.
Oct 9 03:00:33.310994 systemd[1]: session-89.scope: Deactivated successfully.
Oct 9 03:00:33.312864 systemd-logind[1493]: Session 89 logged out. Waiting for processes to exit.
Oct 9 03:00:33.315080 systemd-logind[1493]: Removed session 89.
Oct 9 03:00:35.362590 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.qwK0sd.mount: Deactivated successfully.
Oct 9 03:00:38.482349 systemd[1]: Started sshd@91-188.245.48.63:22-139.178.68.195:37742.service - OpenSSH per-connection server daemon (139.178.68.195:37742).
Oct 9 03:00:39.508929 sshd[8676]: Accepted publickey for core from 139.178.68.195 port 37742 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:39.513753 sshd[8676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:39.522775 systemd-logind[1493]: New session 90 of user core.
Oct 9 03:00:39.528678 systemd[1]: Started session-90.scope - Session 90 of User core.
Oct 9 03:00:40.327953 sshd[8676]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:40.332324 systemd[1]: sshd@91-188.245.48.63:22-139.178.68.195:37742.service: Deactivated successfully.
Oct 9 03:00:40.334988 systemd[1]: session-90.scope: Deactivated successfully.
Oct 9 03:00:40.336031 systemd-logind[1493]: Session 90 logged out. Waiting for processes to exit.
Oct 9 03:00:40.338489 systemd-logind[1493]: Removed session 90.
Oct 9 03:00:45.505981 systemd[1]: Started sshd@92-188.245.48.63:22-139.178.68.195:49942.service - OpenSSH per-connection server daemon (139.178.68.195:49942).
Oct 9 03:00:46.526367 sshd[8710]: Accepted publickey for core from 139.178.68.195 port 49942 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:46.528322 sshd[8710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:46.534370 systemd-logind[1493]: New session 91 of user core.
Oct 9 03:00:46.539953 systemd[1]: Started session-91.scope - Session 91 of User core.
Oct 9 03:00:47.247573 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.ItIMNE.mount: Deactivated successfully.
Oct 9 03:00:47.297424 sshd[8710]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:47.303024 systemd-logind[1493]: Session 91 logged out. Waiting for processes to exit.
Oct 9 03:00:47.303375 systemd[1]: sshd@92-188.245.48.63:22-139.178.68.195:49942.service: Deactivated successfully.
Oct 9 03:00:47.306139 systemd[1]: session-91.scope: Deactivated successfully.
Oct 9 03:00:47.307598 systemd-logind[1493]: Removed session 91.
Oct 9 03:00:52.473858 systemd[1]: Started sshd@93-188.245.48.63:22-139.178.68.195:40008.service - OpenSSH per-connection server daemon (139.178.68.195:40008).
Oct 9 03:00:53.493159 sshd[8750]: Accepted publickey for core from 139.178.68.195 port 40008 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:00:53.497757 sshd[8750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:00:53.507719 systemd-logind[1493]: New session 92 of user core.
Oct 9 03:00:53.513750 systemd[1]: Started session-92.scope - Session 92 of User core.
Oct 9 03:00:54.345002 sshd[8750]: pam_unix(sshd:session): session closed for user core
Oct 9 03:00:54.352597 systemd[1]: sshd@93-188.245.48.63:22-139.178.68.195:40008.service: Deactivated successfully.
Oct 9 03:00:54.357776 systemd[1]: session-92.scope: Deactivated successfully.
Oct 9 03:00:54.359122 systemd-logind[1493]: Session 92 logged out. Waiting for processes to exit.
Oct 9 03:00:54.360884 systemd-logind[1493]: Removed session 92.
Oct 9 03:00:59.516175 systemd[1]: Started sshd@94-188.245.48.63:22-139.178.68.195:40020.service - OpenSSH per-connection server daemon (139.178.68.195:40020).
Oct 9 03:01:00.505257 sshd[8770]: Accepted publickey for core from 139.178.68.195 port 40020 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:00.507107 sshd[8770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:00.511620 systemd-logind[1493]: New session 93 of user core.
Oct 9 03:01:00.516629 systemd[1]: Started session-93.scope - Session 93 of User core.
Oct 9 03:01:01.256695 sshd[8770]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:01.260869 systemd[1]: sshd@94-188.245.48.63:22-139.178.68.195:40020.service: Deactivated successfully.
Oct 9 03:01:01.263375 systemd[1]: session-93.scope: Deactivated successfully.
Oct 9 03:01:01.264644 systemd-logind[1493]: Session 93 logged out. Waiting for processes to exit.
Oct 9 03:01:01.266203 systemd-logind[1493]: Removed session 93.
Oct 9 03:01:06.436223 systemd[1]: Started sshd@95-188.245.48.63:22-139.178.68.195:47776.service - OpenSSH per-connection server daemon (139.178.68.195:47776).
Oct 9 03:01:07.456961 sshd[8789]: Accepted publickey for core from 139.178.68.195 port 47776 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:07.459580 sshd[8789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:07.466155 systemd-logind[1493]: New session 94 of user core.
Oct 9 03:01:07.473344 systemd[1]: Started session-94.scope - Session 94 of User core.
Oct 9 03:01:08.228237 sshd[8789]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:08.233308 systemd[1]: sshd@95-188.245.48.63:22-139.178.68.195:47776.service: Deactivated successfully.
Oct 9 03:01:08.235760 systemd[1]: session-94.scope: Deactivated successfully.
Oct 9 03:01:08.236532 systemd-logind[1493]: Session 94 logged out. Waiting for processes to exit.
Oct 9 03:01:08.237633 systemd-logind[1493]: Removed session 94.
Oct 9 03:01:13.418647 systemd[1]: Started sshd@96-188.245.48.63:22-139.178.68.195:57612.service - OpenSSH per-connection server daemon (139.178.68.195:57612).
Oct 9 03:01:13.563493 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.Wfkh9L.mount: Deactivated successfully.
Oct 9 03:01:14.488114 sshd[8809]: Accepted publickey for core from 139.178.68.195 port 57612 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:14.490720 sshd[8809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:14.495602 systemd-logind[1493]: New session 95 of user core.
Oct 9 03:01:14.499590 systemd[1]: Started session-95.scope - Session 95 of User core.
Oct 9 03:01:15.375087 sshd[8809]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:15.380358 systemd[1]: sshd@96-188.245.48.63:22-139.178.68.195:57612.service: Deactivated successfully.
Oct 9 03:01:15.382917 systemd[1]: session-95.scope: Deactivated successfully.
Oct 9 03:01:15.383941 systemd-logind[1493]: Session 95 logged out. Waiting for processes to exit.
Oct 9 03:01:15.385471 systemd-logind[1493]: Removed session 95.
Oct 9 03:01:20.572873 systemd[1]: Started sshd@97-188.245.48.63:22-139.178.68.195:57626.service - OpenSSH per-connection server daemon (139.178.68.195:57626).
Oct 9 03:01:21.670582 sshd[8870]: Accepted publickey for core from 139.178.68.195 port 57626 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:21.674480 sshd[8870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:21.685678 systemd-logind[1493]: New session 96 of user core.
Oct 9 03:01:21.687675 systemd[1]: Started session-96.scope - Session 96 of User core.
Oct 9 03:01:22.562192 sshd[8870]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:22.566544 systemd[1]: sshd@97-188.245.48.63:22-139.178.68.195:57626.service: Deactivated successfully.
Oct 9 03:01:22.568780 systemd[1]: session-96.scope: Deactivated successfully.
Oct 9 03:01:22.569681 systemd-logind[1493]: Session 96 logged out. Waiting for processes to exit.
Oct 9 03:01:22.570756 systemd-logind[1493]: Removed session 96.
Oct 9 03:01:27.750972 systemd[1]: Started sshd@98-188.245.48.63:22-139.178.68.195:51506.service - OpenSSH per-connection server daemon (139.178.68.195:51506).
Oct 9 03:01:28.774364 sshd[8887]: Accepted publickey for core from 139.178.68.195 port 51506 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:28.775974 sshd[8887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:28.780269 systemd-logind[1493]: New session 97 of user core.
Oct 9 03:01:28.784586 systemd[1]: Started session-97.scope - Session 97 of User core.
Oct 9 03:01:29.582404 sshd[8887]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:29.586117 systemd[1]: sshd@98-188.245.48.63:22-139.178.68.195:51506.service: Deactivated successfully.
Oct 9 03:01:29.588297 systemd[1]: session-97.scope: Deactivated successfully.
Oct 9 03:01:29.589831 systemd-logind[1493]: Session 97 logged out. Waiting for processes to exit.
Oct 9 03:01:29.591085 systemd-logind[1493]: Removed session 97.
Oct 9 03:01:34.766023 systemd[1]: Started sshd@99-188.245.48.63:22-139.178.68.195:58218.service - OpenSSH per-connection server daemon (139.178.68.195:58218).
Oct 9 03:01:35.808490 sshd[8908]: Accepted publickey for core from 139.178.68.195 port 58218 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:35.810420 sshd[8908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:35.815861 systemd-logind[1493]: New session 98 of user core.
Oct 9 03:01:35.821595 systemd[1]: Started session-98.scope - Session 98 of User core.
Oct 9 03:01:36.582978 sshd[8908]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:36.586168 systemd[1]: sshd@99-188.245.48.63:22-139.178.68.195:58218.service: Deactivated successfully.
Oct 9 03:01:36.588386 systemd[1]: session-98.scope: Deactivated successfully.
Oct 9 03:01:36.590033 systemd-logind[1493]: Session 98 logged out. Waiting for processes to exit.
Oct 9 03:01:36.591896 systemd-logind[1493]: Removed session 98.
Oct 9 03:01:41.767626 systemd[1]: Started sshd@100-188.245.48.63:22-139.178.68.195:56806.service - OpenSSH per-connection server daemon (139.178.68.195:56806).
Oct 9 03:01:42.833738 sshd[8960]: Accepted publickey for core from 139.178.68.195 port 56806 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:42.835805 sshd[8960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:42.841330 systemd-logind[1493]: New session 99 of user core.
Oct 9 03:01:42.845597 systemd[1]: Started session-99.scope - Session 99 of User core.
Oct 9 03:01:43.529855 systemd[1]: run-containerd-runc-k8s.io-a36b7f3847bc93583798b759d61f90ffd36e1d3a27801d1d1af1e058a8ec6ce9-runc.HkcBUC.mount: Deactivated successfully.
Oct 9 03:01:43.598084 sshd[8960]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:43.602427 systemd[1]: sshd@100-188.245.48.63:22-139.178.68.195:56806.service: Deactivated successfully.
Oct 9 03:01:43.604641 systemd[1]: session-99.scope: Deactivated successfully.
Oct 9 03:01:43.605406 systemd-logind[1493]: Session 99 logged out. Waiting for processes to exit.
Oct 9 03:01:43.606934 systemd-logind[1493]: Removed session 99.
Oct 9 03:01:48.785815 systemd[1]: Started sshd@101-188.245.48.63:22-139.178.68.195:56816.service - OpenSSH per-connection server daemon (139.178.68.195:56816).
Oct 9 03:01:49.875689 sshd[9015]: Accepted publickey for core from 139.178.68.195 port 56816 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:49.877807 sshd[9015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:49.882529 systemd-logind[1493]: New session 100 of user core.
Oct 9 03:01:49.889620 systemd[1]: Started session-100.scope - Session 100 of User core.
Oct 9 03:01:50.687890 sshd[9015]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:50.691635 systemd-logind[1493]: Session 100 logged out. Waiting for processes to exit.
Oct 9 03:01:50.692177 systemd[1]: sshd@101-188.245.48.63:22-139.178.68.195:56816.service: Deactivated successfully.
Oct 9 03:01:50.694309 systemd[1]: session-100.scope: Deactivated successfully.
Oct 9 03:01:50.695631 systemd-logind[1493]: Removed session 100.
Oct 9 03:01:55.875764 systemd[1]: Started sshd@102-188.245.48.63:22-139.178.68.195:35764.service - OpenSSH per-connection server daemon (139.178.68.195:35764).
Oct 9 03:01:56.913858 sshd[9036]: Accepted publickey for core from 139.178.68.195 port 35764 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:01:56.915812 sshd[9036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:01:56.921065 systemd-logind[1493]: New session 101 of user core.
Oct 9 03:01:56.926831 systemd[1]: Started session-101.scope - Session 101 of User core.
Oct 9 03:01:57.690663 sshd[9036]: pam_unix(sshd:session): session closed for user core
Oct 9 03:01:57.695036 systemd[1]: sshd@102-188.245.48.63:22-139.178.68.195:35764.service: Deactivated successfully.
Oct 9 03:01:57.697010 systemd[1]: session-101.scope: Deactivated successfully.
Oct 9 03:01:57.697749 systemd-logind[1493]: Session 101 logged out. Waiting for processes to exit.
Oct 9 03:01:57.699018 systemd-logind[1493]: Removed session 101.
Oct 9 03:02:02.877692 systemd[1]: Started sshd@103-188.245.48.63:22-139.178.68.195:43102.service - OpenSSH per-connection server daemon (139.178.68.195:43102).
Oct 9 03:02:03.931726 sshd[9054]: Accepted publickey for core from 139.178.68.195 port 43102 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:03.933919 sshd[9054]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:03.939896 systemd-logind[1493]: New session 102 of user core.
Oct 9 03:02:03.946572 systemd[1]: Started session-102.scope - Session 102 of User core.
Oct 9 03:02:04.741988 sshd[9054]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:04.746108 systemd[1]: sshd@103-188.245.48.63:22-139.178.68.195:43102.service: Deactivated successfully.
Oct 9 03:02:04.748490 systemd[1]: session-102.scope: Deactivated successfully.
Oct 9 03:02:04.749344 systemd-logind[1493]: Session 102 logged out. Waiting for processes to exit.
Oct 9 03:02:04.751204 systemd-logind[1493]: Removed session 102.
Oct 9 03:02:09.917986 systemd[1]: Started sshd@104-188.245.48.63:22-139.178.68.195:43108.service - OpenSSH per-connection server daemon (139.178.68.195:43108).
Oct 9 03:02:10.940225 sshd[9067]: Accepted publickey for core from 139.178.68.195 port 43108 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:10.942261 sshd[9067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:10.946880 systemd-logind[1493]: New session 103 of user core.
Oct 9 03:02:10.952580 systemd[1]: Started session-103.scope - Session 103 of User core.
Oct 9 03:02:11.741054 sshd[9067]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:11.745636 systemd[1]: sshd@104-188.245.48.63:22-139.178.68.195:43108.service: Deactivated successfully.
Oct 9 03:02:11.749522 systemd[1]: session-103.scope: Deactivated successfully.
Oct 9 03:02:11.751895 systemd-logind[1493]: Session 103 logged out. Waiting for processes to exit.
Oct 9 03:02:11.753584 systemd-logind[1493]: Removed session 103.
Oct 9 03:02:16.920240 systemd[1]: Started sshd@105-188.245.48.63:22-139.178.68.195:54192.service - OpenSSH per-connection server daemon (139.178.68.195:54192).
Oct 9 03:02:17.275333 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.21HJP8.mount: Deactivated successfully.
Oct 9 03:02:17.978986 sshd[9104]: Accepted publickey for core from 139.178.68.195 port 54192 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:17.981745 sshd[9104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:17.987177 systemd-logind[1493]: New session 104 of user core.
Oct 9 03:02:17.991591 systemd[1]: Started session-104.scope - Session 104 of User core.
Oct 9 03:02:18.816527 sshd[9104]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:18.820163 systemd[1]: sshd@105-188.245.48.63:22-139.178.68.195:54192.service: Deactivated successfully.
Oct 9 03:02:18.822300 systemd[1]: session-104.scope: Deactivated successfully.
Oct 9 03:02:18.825087 systemd-logind[1493]: Session 104 logged out. Waiting for processes to exit.
Oct 9 03:02:18.826518 systemd-logind[1493]: Removed session 104.
Oct 9 03:02:24.008730 systemd[1]: Started sshd@106-188.245.48.63:22-139.178.68.195:33668.service - OpenSSH per-connection server daemon (139.178.68.195:33668).
Oct 9 03:02:25.091247 sshd[9145]: Accepted publickey for core from 139.178.68.195 port 33668 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:25.092924 sshd[9145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:25.097065 systemd-logind[1493]: New session 105 of user core.
Oct 9 03:02:25.107612 systemd[1]: Started session-105.scope - Session 105 of User core.
Oct 9 03:02:25.928534 sshd[9145]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:25.933412 systemd[1]: sshd@106-188.245.48.63:22-139.178.68.195:33668.service: Deactivated successfully.
Oct 9 03:02:25.935888 systemd[1]: session-105.scope: Deactivated successfully.
Oct 9 03:02:25.936927 systemd-logind[1493]: Session 105 logged out. Waiting for processes to exit.
Oct 9 03:02:25.938262 systemd-logind[1493]: Removed session 105.
Oct 9 03:02:31.118117 systemd[1]: Started sshd@107-188.245.48.63:22-139.178.68.195:53874.service - OpenSSH per-connection server daemon (139.178.68.195:53874).
Oct 9 03:02:32.225532 sshd[9160]: Accepted publickey for core from 139.178.68.195 port 53874 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:32.227819 sshd[9160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:32.234261 systemd-logind[1493]: New session 106 of user core.
Oct 9 03:02:32.238591 systemd[1]: Started session-106.scope - Session 106 of User core.
Oct 9 03:02:33.066756 sshd[9160]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:33.071366 systemd[1]: sshd@107-188.245.48.63:22-139.178.68.195:53874.service: Deactivated successfully.
Oct 9 03:02:33.075136 systemd[1]: session-106.scope: Deactivated successfully.
Oct 9 03:02:33.075930 systemd-logind[1493]: Session 106 logged out. Waiting for processes to exit.
Oct 9 03:02:33.077040 systemd-logind[1493]: Removed session 106.
Oct 9 03:02:38.236306 systemd[1]: Started sshd@108-188.245.48.63:22-139.178.68.195:53886.service - OpenSSH per-connection server daemon (139.178.68.195:53886).
Oct 9 03:02:39.232127 sshd[9198]: Accepted publickey for core from 139.178.68.195 port 53886 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:39.234360 sshd[9198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:39.239547 systemd-logind[1493]: New session 107 of user core.
Oct 9 03:02:39.245604 systemd[1]: Started session-107.scope - Session 107 of User core.
Oct 9 03:02:39.978762 sshd[9198]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:39.982601 systemd-logind[1493]: Session 107 logged out. Waiting for processes to exit.
Oct 9 03:02:39.983614 systemd[1]: sshd@108-188.245.48.63:22-139.178.68.195:53886.service: Deactivated successfully.
Oct 9 03:02:39.985750 systemd[1]: session-107.scope: Deactivated successfully.
Oct 9 03:02:39.986686 systemd-logind[1493]: Removed session 107.
Oct 9 03:02:45.159701 systemd[1]: Started sshd@109-188.245.48.63:22-139.178.68.195:53824.service - OpenSSH per-connection server daemon (139.178.68.195:53824).
Oct 9 03:02:46.207261 sshd[9238]: Accepted publickey for core from 139.178.68.195 port 53824 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:46.208947 sshd[9238]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:46.213515 systemd-logind[1493]: New session 108 of user core.
Oct 9 03:02:46.217589 systemd[1]: Started session-108.scope - Session 108 of User core.
Oct 9 03:02:47.043350 sshd[9238]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:47.049972 systemd[1]: sshd@109-188.245.48.63:22-139.178.68.195:53824.service: Deactivated successfully.
Oct 9 03:02:47.054708 systemd[1]: session-108.scope: Deactivated successfully.
Oct 9 03:02:47.056417 systemd-logind[1493]: Session 108 logged out. Waiting for processes to exit.
Oct 9 03:02:47.059272 systemd-logind[1493]: Removed session 108.
Oct 9 03:02:52.242754 systemd[1]: Started sshd@110-188.245.48.63:22-139.178.68.195:45216.service - OpenSSH per-connection server daemon (139.178.68.195:45216).
Oct 9 03:02:53.270153 sshd[9273]: Accepted publickey for core from 139.178.68.195 port 45216 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:02:53.273104 sshd[9273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:02:53.280185 systemd-logind[1493]: New session 109 of user core.
Oct 9 03:02:53.284590 systemd[1]: Started session-109.scope - Session 109 of User core.
Oct 9 03:02:54.035940 sshd[9273]: pam_unix(sshd:session): session closed for user core
Oct 9 03:02:54.040239 systemd[1]: sshd@110-188.245.48.63:22-139.178.68.195:45216.service: Deactivated successfully.
Oct 9 03:02:54.043082 systemd[1]: session-109.scope: Deactivated successfully.
Oct 9 03:02:54.043843 systemd-logind[1493]: Session 109 logged out. Waiting for processes to exit.
Oct 9 03:02:54.045004 systemd-logind[1493]: Removed session 109.
Oct 9 03:02:59.229684 systemd[1]: Started sshd@111-188.245.48.63:22-139.178.68.195:45218.service - OpenSSH per-connection server daemon (139.178.68.195:45218).
Oct 9 03:02:59.665769 systemd[1]: Started sshd@112-188.245.48.63:22-194.169.175.38:17352.service - OpenSSH per-connection server daemon (194.169.175.38:17352).
Oct 9 03:03:00.227591 sshd[9296]: Invalid user user from 194.169.175.38 port 17352
Oct 9 03:03:00.264696 sshd[9296]: Connection closed by invalid user user 194.169.175.38 port 17352 [preauth]
Oct 9 03:03:00.267387 systemd[1]: sshd@112-188.245.48.63:22-194.169.175.38:17352.service: Deactivated successfully.
Oct 9 03:03:00.335713 sshd[9293]: Accepted publickey for core from 139.178.68.195 port 45218 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:00.337478 sshd[9293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:00.342716 systemd-logind[1493]: New session 110 of user core.
Oct 9 03:03:00.350611 systemd[1]: Started session-110.scope - Session 110 of User core.
Oct 9 03:03:01.140146 sshd[9293]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:01.144131 systemd[1]: sshd@111-188.245.48.63:22-139.178.68.195:45218.service: Deactivated successfully.
Oct 9 03:03:01.147065 systemd[1]: session-110.scope: Deactivated successfully.
Oct 9 03:03:01.147923 systemd-logind[1493]: Session 110 logged out. Waiting for processes to exit.
Oct 9 03:03:01.149041 systemd-logind[1493]: Removed session 110.
Oct 9 03:03:06.334825 systemd[1]: Started sshd@113-188.245.48.63:22-139.178.68.195:50206.service - OpenSSH per-connection server daemon (139.178.68.195:50206).
Oct 9 03:03:07.438132 sshd[9317]: Accepted publickey for core from 139.178.68.195 port 50206 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:07.440645 sshd[9317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:07.447916 systemd-logind[1493]: New session 111 of user core.
Oct 9 03:03:07.451614 systemd[1]: Started session-111.scope - Session 111 of User core.
Oct 9 03:03:08.241654 sshd[9317]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:08.247321 systemd-logind[1493]: Session 111 logged out. Waiting for processes to exit.
Oct 9 03:03:08.248799 systemd[1]: sshd@113-188.245.48.63:22-139.178.68.195:50206.service: Deactivated successfully.
Oct 9 03:03:08.252270 systemd[1]: session-111.scope: Deactivated successfully.
Oct 9 03:03:08.253974 systemd-logind[1493]: Removed session 111.
Oct 9 03:03:13.427849 systemd[1]: Started sshd@114-188.245.48.63:22-139.178.68.195:47730.service - OpenSSH per-connection server daemon (139.178.68.195:47730).
Oct 9 03:03:14.468403 sshd[9342]: Accepted publickey for core from 139.178.68.195 port 47730 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:14.470206 sshd[9342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:14.474654 systemd-logind[1493]: New session 112 of user core.
Oct 9 03:03:14.481596 systemd[1]: Started session-112.scope - Session 112 of User core.
Oct 9 03:03:15.217627 sshd[9342]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:15.221782 systemd[1]: sshd@114-188.245.48.63:22-139.178.68.195:47730.service: Deactivated successfully.
Oct 9 03:03:15.224199 systemd[1]: session-112.scope: Deactivated successfully.
Oct 9 03:03:15.225555 systemd-logind[1493]: Session 112 logged out. Waiting for processes to exit.
Oct 9 03:03:15.226761 systemd-logind[1493]: Removed session 112.
Oct 9 03:03:20.396916 systemd[1]: Started sshd@115-188.245.48.63:22-139.178.68.195:47732.service - OpenSSH per-connection server daemon (139.178.68.195:47732).
Oct 9 03:03:21.398244 sshd[9402]: Accepted publickey for core from 139.178.68.195 port 47732 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:21.400056 sshd[9402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:21.405013 systemd-logind[1493]: New session 113 of user core.
Oct 9 03:03:21.409573 systemd[1]: Started session-113.scope - Session 113 of User core.
Oct 9 03:03:22.151979 sshd[9402]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:22.157192 systemd[1]: sshd@115-188.245.48.63:22-139.178.68.195:47732.service: Deactivated successfully.
Oct 9 03:03:22.159889 systemd[1]: session-113.scope: Deactivated successfully.
Oct 9 03:03:22.160762 systemd-logind[1493]: Session 113 logged out. Waiting for processes to exit.
Oct 9 03:03:22.162620 systemd-logind[1493]: Removed session 113.
Oct 9 03:03:27.330726 systemd[1]: Started sshd@116-188.245.48.63:22-139.178.68.195:36986.service - OpenSSH per-connection server daemon (139.178.68.195:36986).
Oct 9 03:03:28.339146 sshd[9422]: Accepted publickey for core from 139.178.68.195 port 36986 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:28.341021 sshd[9422]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:28.345907 systemd-logind[1493]: New session 114 of user core.
Oct 9 03:03:28.351569 systemd[1]: Started session-114.scope - Session 114 of User core.
Oct 9 03:03:29.083852 sshd[9422]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:29.087162 systemd[1]: sshd@116-188.245.48.63:22-139.178.68.195:36986.service: Deactivated successfully.
Oct 9 03:03:29.089903 systemd[1]: session-114.scope: Deactivated successfully.
Oct 9 03:03:29.091737 systemd-logind[1493]: Session 114 logged out. Waiting for processes to exit.
Oct 9 03:03:29.093090 systemd-logind[1493]: Removed session 114.
Oct 9 03:03:34.264969 systemd[1]: Started sshd@117-188.245.48.63:22-139.178.68.195:56812.service - OpenSSH per-connection server daemon (139.178.68.195:56812).
Oct 9 03:03:35.270394 sshd[9435]: Accepted publickey for core from 139.178.68.195 port 56812 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:35.273567 sshd[9435]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:35.281125 systemd-logind[1493]: New session 115 of user core.
Oct 9 03:03:35.288695 systemd[1]: Started session-115.scope - Session 115 of User core.
Oct 9 03:03:36.036552 sshd[9435]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:36.040572 systemd[1]: sshd@117-188.245.48.63:22-139.178.68.195:56812.service: Deactivated successfully.
Oct 9 03:03:36.042853 systemd[1]: session-115.scope: Deactivated successfully.
Oct 9 03:03:36.043909 systemd-logind[1493]: Session 115 logged out. Waiting for processes to exit.
Oct 9 03:03:36.045250 systemd-logind[1493]: Removed session 115.
Oct 9 03:03:41.207561 systemd[1]: Started sshd@118-188.245.48.63:22-139.178.68.195:32878.service - OpenSSH per-connection server daemon (139.178.68.195:32878).
Oct 9 03:03:42.203271 sshd[9473]: Accepted publickey for core from 139.178.68.195 port 32878 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:42.205287 sshd[9473]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:42.210100 systemd-logind[1493]: New session 116 of user core.
Oct 9 03:03:42.215591 systemd[1]: Started session-116.scope - Session 116 of User core.
Oct 9 03:03:42.950697 sshd[9473]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:42.955230 systemd[1]: sshd@118-188.245.48.63:22-139.178.68.195:32878.service: Deactivated successfully.
Oct 9 03:03:42.957379 systemd[1]: session-116.scope: Deactivated successfully.
Oct 9 03:03:42.958705 systemd-logind[1493]: Session 116 logged out. Waiting for processes to exit.
Oct 9 03:03:42.959756 systemd-logind[1493]: Removed session 116.
Oct 9 03:03:47.249517 systemd[1]: run-containerd-runc-k8s.io-b035779be570a2f49e0f11382586762b34939c9dcdd80c3e6056dcb539fdc29e-runc.PwV1yT.mount: Deactivated successfully.
Oct 9 03:03:48.133800 systemd[1]: Started sshd@119-188.245.48.63:22-139.178.68.195:32888.service - OpenSSH per-connection server daemon (139.178.68.195:32888).
Oct 9 03:03:49.180041 sshd[9534]: Accepted publickey for core from 139.178.68.195 port 32888 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:49.182839 sshd[9534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:49.188033 systemd-logind[1493]: New session 117 of user core.
Oct 9 03:03:49.195565 systemd[1]: Started session-117.scope - Session 117 of User core.
Oct 9 03:03:50.117948 sshd[9534]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:50.122352 systemd-logind[1493]: Session 117 logged out. Waiting for processes to exit.
Oct 9 03:03:50.123296 systemd[1]: sshd@119-188.245.48.63:22-139.178.68.195:32888.service: Deactivated successfully.
Oct 9 03:03:50.125924 systemd[1]: session-117.scope: Deactivated successfully.
Oct 9 03:03:50.126939 systemd-logind[1493]: Removed session 117.
Oct 9 03:03:55.297799 systemd[1]: Started sshd@120-188.245.48.63:22-139.178.68.195:53902.service - OpenSSH per-connection server daemon (139.178.68.195:53902).
Oct 9 03:03:56.290820 sshd[9548]: Accepted publickey for core from 139.178.68.195 port 53902 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:03:56.292686 sshd[9548]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:03:56.297529 systemd-logind[1493]: New session 118 of user core.
Oct 9 03:03:56.303600 systemd[1]: Started session-118.scope - Session 118 of User core.
Oct 9 03:03:57.070909 sshd[9548]: pam_unix(sshd:session): session closed for user core
Oct 9 03:03:57.074401 systemd[1]: sshd@120-188.245.48.63:22-139.178.68.195:53902.service: Deactivated successfully.
Oct 9 03:03:57.076842 systemd[1]: session-118.scope: Deactivated successfully.
Oct 9 03:03:57.077678 systemd-logind[1493]: Session 118 logged out. Waiting for processes to exit.
Oct 9 03:03:57.078793 systemd-logind[1493]: Removed session 118.
Oct 9 03:04:02.246309 systemd[1]: Started sshd@121-188.245.48.63:22-139.178.68.195:43868.service - OpenSSH per-connection server daemon (139.178.68.195:43868).
Oct 9 03:04:03.268995 sshd[9568]: Accepted publickey for core from 139.178.68.195 port 43868 ssh2: RSA SHA256:v48Zw3NfUjWfVrcdWMmDtwgBD76YkG1RqExUPbRuvxw
Oct 9 03:04:03.270910 sshd[9568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Oct 9 03:04:03.276601 systemd-logind[1493]: New session 119 of user core.
Oct 9 03:04:03.285588 systemd[1]: Started session-119.scope - Session 119 of User core.
Oct 9 03:04:04.052757 sshd[9568]: pam_unix(sshd:session): session closed for user core
Oct 9 03:04:04.058915 systemd[1]: sshd@121-188.245.48.63:22-139.178.68.195:43868.service: Deactivated successfully.
Oct 9 03:04:04.063051 systemd[1]: session-119.scope: Deactivated successfully.
Oct 9 03:04:04.064390 systemd-logind[1493]: Session 119 logged out. Waiting for processes to exit.
Oct 9 03:04:04.066066 systemd-logind[1493]: Removed session 119.