Feb 13 08:16:42.544932 kernel: Linux version 5.15.148-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Feb 12 18:05:31 -00 2024 Feb 13 08:16:42.544945 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4 Feb 13 08:16:42.544952 kernel: BIOS-provided physical RAM map: Feb 13 08:16:42.544956 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Feb 13 08:16:42.544959 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Feb 13 08:16:42.544963 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Feb 13 08:16:42.544967 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Feb 13 08:16:42.544971 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Feb 13 08:16:42.544975 kernel: BIOS-e820: [mem 0x0000000040400000-0x00000000820e1fff] usable Feb 13 08:16:42.544979 kernel: BIOS-e820: [mem 0x00000000820e2000-0x00000000820e2fff] ACPI NVS Feb 13 08:16:42.544983 kernel: BIOS-e820: [mem 0x00000000820e3000-0x00000000820e3fff] reserved Feb 13 08:16:42.544987 kernel: BIOS-e820: [mem 0x00000000820e4000-0x000000008afccfff] usable Feb 13 08:16:42.544991 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved Feb 13 08:16:42.544995 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable Feb 13 08:16:42.545000 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS Feb 13 08:16:42.545005 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved Feb 13 08:16:42.545009 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable Feb 13 08:16:42.545013 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved Feb 13 08:16:42.545018 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Feb 13 08:16:42.545022 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Feb 13 08:16:42.545026 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Feb 13 08:16:42.545030 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Feb 13 08:16:42.545034 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Feb 13 08:16:42.545038 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable Feb 13 08:16:42.545042 kernel: NX (Execute Disable) protection: active Feb 13 08:16:42.545046 kernel: SMBIOS 3.2.1 present. 
Feb 13 08:16:42.545051 kernel: DMI: Supermicro X11SCM-F/X11SCM-F, BIOS 1.9 09/16/2022 Feb 13 08:16:42.545055 kernel: tsc: Detected 3400.000 MHz processor Feb 13 08:16:42.545059 kernel: tsc: Detected 3399.906 MHz TSC Feb 13 08:16:42.545064 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Feb 13 08:16:42.545068 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Feb 13 08:16:42.545072 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000 Feb 13 08:16:42.545077 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Feb 13 08:16:42.545081 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000 Feb 13 08:16:42.545085 kernel: Using GB pages for direct mapping Feb 13 08:16:42.545090 kernel: ACPI: Early table checksum verification disabled Feb 13 08:16:42.545095 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Feb 13 08:16:42.545099 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Feb 13 08:16:42.545103 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013) Feb 13 08:16:42.545108 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Feb 13 08:16:42.545114 kernel: ACPI: FACS 0x000000008C66CF80 000040 Feb 13 08:16:42.545118 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013) Feb 13 08:16:42.545124 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013) Feb 13 08:16:42.545128 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Feb 13 08:16:42.545133 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Feb 13 08:16:42.545138 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000) Feb 13 08:16:42.545142 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Feb 13 08:16:42.545147 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Feb 13 08:16:42.545152 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Feb 13 08:16:42.545156 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Feb 13 08:16:42.545162 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Feb 13 08:16:42.545166 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527) Feb 13 08:16:42.545171 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Feb 13 08:16:42.545175 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Feb 13 08:16:42.545180 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Feb 13 08:16:42.545185 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Feb 13 08:16:42.545189 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Feb 13 08:16:42.545194 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Feb 13 08:16:42.545199 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Feb 13 08:16:42.545204 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013) Feb 13 08:16:42.545208 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Feb 13 08:16:42.545213 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Feb 13 
08:16:42.545218 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Feb 13 08:16:42.545222 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013) Feb 13 08:16:42.545227 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Feb 13 08:16:42.545231 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Feb 13 08:16:42.545236 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Feb 13 08:16:42.545241 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000) Feb 13 08:16:42.545246 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Feb 13 08:16:42.545251 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783] Feb 13 08:16:42.545255 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b] Feb 13 08:16:42.545260 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf] Feb 13 08:16:42.545264 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3] Feb 13 08:16:42.545269 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb] Feb 13 08:16:42.545274 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b] Feb 13 08:16:42.545279 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db] Feb 13 08:16:42.545283 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20] Feb 13 08:16:42.545288 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543] Feb 13 08:16:42.545293 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d] Feb 13 08:16:42.545297 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a] Feb 13 08:16:42.545302 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77] Feb 13 08:16:42.545306 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25] Feb 13 08:16:42.545311 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b] Feb 13 08:16:42.545315 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361] Feb 13 08:16:42.545321 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb] Feb 13 08:16:42.545325 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd] Feb 13 08:16:42.545330 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1] Feb 13 08:16:42.545334 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb] Feb 13 08:16:42.545339 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153] Feb 13 08:16:42.545344 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe] Feb 13 08:16:42.545348 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f] Feb 13 08:16:42.545353 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73] Feb 13 08:16:42.545357 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab] Feb 13 08:16:42.545362 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e] Feb 13 08:16:42.545367 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67] Feb 13 08:16:42.545372 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97] Feb 13 08:16:42.545376 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7] Feb 13 08:16:42.545381 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7] Feb 13 
08:16:42.545385 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273] Feb 13 08:16:42.545390 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9] Feb 13 08:16:42.545394 kernel: No NUMA configuration found Feb 13 08:16:42.545399 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff] Feb 13 08:16:42.545405 kernel: NODE_DATA(0) allocated [mem 0x86effa000-0x86effffff] Feb 13 08:16:42.545409 kernel: Zone ranges: Feb 13 08:16:42.545414 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Feb 13 08:16:42.545418 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Feb 13 08:16:42.545423 kernel: Normal [mem 0x0000000100000000-0x000000086effffff] Feb 13 08:16:42.545427 kernel: Movable zone start for each node Feb 13 08:16:42.545432 kernel: Early memory node ranges Feb 13 08:16:42.545437 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Feb 13 08:16:42.545441 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Feb 13 08:16:42.545446 kernel: node 0: [mem 0x0000000040400000-0x00000000820e1fff] Feb 13 08:16:42.545451 kernel: node 0: [mem 0x00000000820e4000-0x000000008afccfff] Feb 13 08:16:42.545456 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff] Feb 13 08:16:42.545460 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff] Feb 13 08:16:42.545465 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff] Feb 13 08:16:42.545472 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff] Feb 13 08:16:42.545477 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Feb 13 08:16:42.545500 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Feb 13 08:16:42.545505 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Feb 13 08:16:42.545510 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Feb 13 08:16:42.545515 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges Feb 13 08:16:42.545520 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges Feb 13 08:16:42.545525 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges Feb 13 08:16:42.545530 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges Feb 13 08:16:42.545535 kernel: ACPI: PM-Timer IO Port: 0x1808 Feb 13 08:16:42.545540 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Feb 13 08:16:42.545545 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Feb 13 08:16:42.545550 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Feb 13 08:16:42.545555 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Feb 13 08:16:42.545560 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Feb 13 08:16:42.545565 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Feb 13 08:16:42.545570 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Feb 13 08:16:42.545574 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Feb 13 08:16:42.545579 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Feb 13 08:16:42.545584 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Feb 13 08:16:42.545589 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Feb 13 08:16:42.545594 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Feb 13 08:16:42.545599 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Feb 13 08:16:42.545604 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Feb 13 08:16:42.545609 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Feb 13 08:16:42.545613 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x10] high edge lint[0x1]) Feb 13 08:16:42.545618 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Feb 13 08:16:42.545623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Feb 13 08:16:42.545628 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Feb 13 08:16:42.545633 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Feb 13 08:16:42.545638 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Feb 13 08:16:42.545643 kernel: TSC deadline timer available Feb 13 08:16:42.545648 kernel: smpboot: Allowing 16 CPUs, 0 hotplug CPUs Feb 13 08:16:42.545653 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices Feb 13 08:16:42.545658 kernel: Booting paravirtualized kernel on bare hardware Feb 13 08:16:42.545662 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Feb 13 08:16:42.545667 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:16 nr_node_ids:1 Feb 13 08:16:42.545672 kernel: percpu: Embedded 55 pages/cpu s185624 r8192 d31464 u262144 Feb 13 08:16:42.545677 kernel: pcpu-alloc: s185624 r8192 d31464 u262144 alloc=1*2097152 Feb 13 08:16:42.545682 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Feb 13 08:16:42.545687 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8232415 Feb 13 08:16:42.545692 kernel: Policy zone: Normal Feb 13 08:16:42.545697 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4 Feb 13 08:16:42.545703 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Feb 13 08:16:42.545707 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Feb 13 08:16:42.545712 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Feb 13 08:16:42.545717 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Feb 13 08:16:42.545722 kernel: Memory: 32724720K/33452980K available (12294K kernel code, 2275K rwdata, 13700K rodata, 45496K init, 4048K bss, 728000K reserved, 0K cma-reserved) Feb 13 08:16:42.545728 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Feb 13 08:16:42.545733 kernel: ftrace: allocating 34475 entries in 135 pages Feb 13 08:16:42.545737 kernel: ftrace: allocated 135 pages with 4 groups Feb 13 08:16:42.545742 kernel: rcu: Hierarchical RCU implementation. Feb 13 08:16:42.545748 kernel: rcu: RCU event tracing is enabled. Feb 13 08:16:42.545753 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Feb 13 08:16:42.545758 kernel: Rude variant of Tasks RCU enabled. Feb 13 08:16:42.545762 kernel: Tracing variant of Tasks RCU enabled. Feb 13 08:16:42.545767 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Feb 13 08:16:42.545773 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Feb 13 08:16:42.545778 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Feb 13 08:16:42.545782 kernel: random: crng init done Feb 13 08:16:42.545787 kernel: Console: colour dummy device 80x25 Feb 13 08:16:42.545792 kernel: printk: console [tty0] enabled Feb 13 08:16:42.545797 kernel: printk: console [ttyS1] enabled Feb 13 08:16:42.545802 kernel: ACPI: Core revision 20210730 Feb 13 08:16:42.545807 kernel: hpet: HPET dysfunctional in PC10. Force disabled. Feb 13 08:16:42.545811 kernel: APIC: Switch to symmetric I/O mode setup Feb 13 08:16:42.545817 kernel: DMAR: Host address width 39 Feb 13 08:16:42.545822 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Feb 13 08:16:42.545827 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Feb 13 08:16:42.545832 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff Feb 13 08:16:42.545836 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0 Feb 13 08:16:42.545841 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Feb 13 08:16:42.545846 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Feb 13 08:16:42.545851 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Feb 13 08:16:42.545856 kernel: x2apic enabled Feb 13 08:16:42.545861 kernel: Switched APIC routing to cluster x2apic. Feb 13 08:16:42.545866 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Feb 13 08:16:42.545871 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906) Feb 13 08:16:42.545876 kernel: CPU0: Thermal monitoring enabled (TM1) Feb 13 08:16:42.545881 kernel: process: using mwait in idle threads Feb 13 08:16:42.545885 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Feb 13 08:16:42.545890 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Feb 13 08:16:42.545895 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Feb 13 08:16:42.545900 kernel: Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks! 
Feb 13 08:16:42.545905 kernel: Spectre V2 : Mitigation: Enhanced IBRS Feb 13 08:16:42.545910 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Feb 13 08:16:42.545915 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Feb 13 08:16:42.545920 kernel: RETBleed: Mitigation: Enhanced IBRS Feb 13 08:16:42.545924 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Feb 13 08:16:42.545929 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp Feb 13 08:16:42.545934 kernel: TAA: Mitigation: TSX disabled Feb 13 08:16:42.545938 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Feb 13 08:16:42.545943 kernel: SRBDS: Mitigation: Microcode Feb 13 08:16:42.545948 kernel: GDS: Vulnerable: No microcode Feb 13 08:16:42.545953 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Feb 13 08:16:42.545958 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Feb 13 08:16:42.545963 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Feb 13 08:16:42.545968 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Feb 13 08:16:42.545973 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Feb 13 08:16:42.545977 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Feb 13 08:16:42.545982 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Feb 13 08:16:42.545987 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Feb 13 08:16:42.545992 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Feb 13 08:16:42.545996 kernel: Freeing SMP alternatives memory: 32K Feb 13 08:16:42.546001 kernel: pid_max: default: 32768 minimum: 301 Feb 13 08:16:42.546006 kernel: LSM: Security Framework initializing Feb 13 08:16:42.546011 kernel: SELinux: Initializing. Feb 13 08:16:42.546016 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Feb 13 08:16:42.546021 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Feb 13 08:16:42.546026 kernel: smpboot: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Feb 13 08:16:42.546030 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Feb 13 08:16:42.546035 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Feb 13 08:16:42.546040 kernel: ... version: 4 Feb 13 08:16:42.546045 kernel: ... bit width: 48 Feb 13 08:16:42.546050 kernel: ... generic registers: 4 Feb 13 08:16:42.546054 kernel: ... value mask: 0000ffffffffffff Feb 13 08:16:42.546059 kernel: ... max period: 00007fffffffffff Feb 13 08:16:42.546065 kernel: ... fixed-purpose events: 3 Feb 13 08:16:42.546070 kernel: ... event mask: 000000070000000f Feb 13 08:16:42.546074 kernel: signal: max sigframe size: 2032 Feb 13 08:16:42.546079 kernel: rcu: Hierarchical SRCU implementation. Feb 13 08:16:42.546084 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Feb 13 08:16:42.546089 kernel: smp: Bringing up secondary CPUs ... Feb 13 08:16:42.546094 kernel: x86: Booting SMP configuration: Feb 13 08:16:42.546098 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 Feb 13 08:16:42.546103 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. 
See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Feb 13 08:16:42.546109 kernel: #9 #10 #11 #12 #13 #14 #15 Feb 13 08:16:42.546114 kernel: smp: Brought up 1 node, 16 CPUs Feb 13 08:16:42.546119 kernel: smpboot: Max logical packages: 1 Feb 13 08:16:42.546124 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Feb 13 08:16:42.546128 kernel: devtmpfs: initialized Feb 13 08:16:42.546133 kernel: x86/mm: Memory block size: 128MB Feb 13 08:16:42.546138 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x820e2000-0x820e2fff] (4096 bytes) Feb 13 08:16:42.546143 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes) Feb 13 08:16:42.546148 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 08:16:42.546153 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Feb 13 08:16:42.546158 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 08:16:42.546163 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 08:16:42.546168 kernel: audit: initializing netlink subsys (disabled) Feb 13 08:16:42.546172 kernel: audit: type=2000 audit(1707812196.040:1): state=initialized audit_enabled=0 res=1 Feb 13 08:16:42.546177 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 08:16:42.546182 kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 08:16:42.546187 kernel: cpuidle: using governor menu Feb 13 08:16:42.546192 kernel: ACPI: bus type PCI registered Feb 13 08:16:42.546197 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 08:16:42.546202 kernel: dca service started, version 1.12.1 Feb 13 08:16:42.546207 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000) Feb 13 08:16:42.546212 kernel: PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in E820 Feb 13 08:16:42.546216 kernel: PCI: Using configuration type 1 for base access Feb 13 08:16:42.546221 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Feb 13 08:16:42.546226 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Feb 13 08:16:42.546231 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 08:16:42.546236 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 08:16:42.546241 kernel: ACPI: Added _OSI(Module Device) Feb 13 08:16:42.546246 kernel: ACPI: Added _OSI(Processor Device) Feb 13 08:16:42.546251 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 08:16:42.546256 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 08:16:42.546260 kernel: ACPI: Added _OSI(Linux-Dell-Video) Feb 13 08:16:42.546265 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio) Feb 13 08:16:42.546270 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics) Feb 13 08:16:42.546275 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Feb 13 08:16:42.546280 kernel: ACPI: Dynamic OEM Table Load: Feb 13 08:16:42.546285 kernel: ACPI: SSDT 0xFFFF927AC1353B00 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Feb 13 08:16:42.546290 kernel: ACPI: \_SB_.PR00: _OSC native thermal LVT Acked Feb 13 08:16:42.546296 kernel: ACPI: Dynamic OEM Table Load: Feb 13 08:16:42.546300 kernel: ACPI: SSDT 0xFFFF927AC134AC00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Feb 13 08:16:42.546305 kernel: ACPI: Dynamic OEM Table Load: Feb 13 08:16:42.546310 kernel: ACPI: SSDT 0xFFFF927AC1B9A800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Feb 13 08:16:42.546315 kernel: ACPI: Dynamic OEM Table Load: Feb 13 08:16:42.546319 kernel: ACPI: SSDT 0xFFFF927AC1B9F800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Feb 13 08:16:42.546324 kernel: ACPI: Dynamic OEM Table Load: Feb 13 08:16:42.546329 kernel: ACPI: SSDT 0xFFFF927AC1343000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Feb 13 08:16:42.546334 kernel: ACPI: Dynamic OEM Table Load: Feb 13 08:16:42.546339 kernel: ACPI: SSDT 0xFFFF927AC134A000 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Feb 13 08:16:42.546344 kernel: ACPI: Interpreter enabled Feb 13 08:16:42.546349 kernel: ACPI: PM: (supports S0 S5) Feb 13 08:16:42.546353 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 08:16:42.546358 kernel: HEST: Enabling Firmware First mode for corrected errors. Feb 13 08:16:42.546363 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Feb 13 08:16:42.546368 kernel: HEST: Table parsing has been initialized. Feb 13 08:16:42.546373 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Feb 13 08:16:42.546378 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 08:16:42.546383 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Feb 13 08:16:42.546388 kernel: ACPI: PM: Power Resource [USBC] Feb 13 08:16:42.546393 kernel: ACPI: PM: Power Resource [V0PR] Feb 13 08:16:42.546397 kernel: ACPI: PM: Power Resource [V1PR] Feb 13 08:16:42.546402 kernel: ACPI: PM: Power Resource [V2PR] Feb 13 08:16:42.546407 kernel: ACPI: PM: Power Resource [WRST] Feb 13 08:16:42.546411 kernel: ACPI: PM: Power Resource [FN00] Feb 13 08:16:42.546417 kernel: ACPI: PM: Power Resource [FN01] Feb 13 08:16:42.546422 kernel: ACPI: PM: Power Resource [FN02] Feb 13 08:16:42.546427 kernel: ACPI: PM: Power Resource [FN03] Feb 13 08:16:42.546431 kernel: ACPI: PM: Power Resource [FN04] Feb 13 08:16:42.546436 kernel: ACPI: PM: Power Resource [PIN] Feb 13 08:16:42.546441 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Feb 13 08:16:42.546506 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 08:16:42.546550 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Feb 13 08:16:42.546592 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Feb 13 08:16:42.546599 kernel: PCI host bridge to bus 0000:00 Feb 13 08:16:42.546644 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 08:16:42.546680 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Feb 13 08:16:42.546716 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 08:16:42.546751 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window] Feb 13 08:16:42.546786 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Feb 13 08:16:42.546823 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Feb 13 08:16:42.546871 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 Feb 13 08:16:42.546918 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 Feb 13 08:16:42.546960 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.547004 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 Feb 13 08:16:42.547045 kernel: pci 0000:00:08.0: reg 0x10: [mem 0x9551f000-0x9551ffff 64bit] Feb 13 08:16:42.547091 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 Feb 13 08:16:42.547132 kernel: pci 0000:00:12.0: reg 0x10: [mem 0x9551e000-0x9551efff 64bit] Feb 13 08:16:42.547178 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 Feb 13 08:16:42.547220 kernel: pci 0000:00:14.0: reg 0x10: [mem 0x95500000-0x9550ffff 64bit] Feb 13 08:16:42.547261 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Feb 13 08:16:42.547306 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 Feb 13 08:16:42.547349 kernel: pci 0000:00:14.2: reg 0x10: [mem 0x95512000-0x95513fff 64bit] Feb 13 08:16:42.547388 kernel: pci 0000:00:14.2: reg 0x18: [mem 0x9551d000-0x9551dfff 64bit] Feb 13 08:16:42.547432 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 Feb 13 08:16:42.547475 kernel: pci 0000:00:15.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit] Feb 13 08:16:42.547520 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 Feb 13 08:16:42.547560 kernel: pci 0000:00:15.1: reg 0x10: [mem 0x00000000-0x00000fff 64bit] Feb 13 08:16:42.547606 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 Feb 13 08:16:42.547648 kernel: pci 0000:00:16.0: reg 0x10: [mem 
0x9551a000-0x9551afff 64bit] Feb 13 08:16:42.547688 kernel: pci 0000:00:16.0: PME# supported from D3hot Feb 13 08:16:42.547731 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 Feb 13 08:16:42.547771 kernel: pci 0000:00:16.1: reg 0x10: [mem 0x95519000-0x95519fff 64bit] Feb 13 08:16:42.547811 kernel: pci 0000:00:16.1: PME# supported from D3hot Feb 13 08:16:42.547856 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 Feb 13 08:16:42.547899 kernel: pci 0000:00:16.4: reg 0x10: [mem 0x95518000-0x95518fff 64bit] Feb 13 08:16:42.547938 kernel: pci 0000:00:16.4: PME# supported from D3hot Feb 13 08:16:42.547982 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 Feb 13 08:16:42.548021 kernel: pci 0000:00:17.0: reg 0x10: [mem 0x95510000-0x95511fff] Feb 13 08:16:42.548061 kernel: pci 0000:00:17.0: reg 0x14: [mem 0x95517000-0x955170ff] Feb 13 08:16:42.548101 kernel: pci 0000:00:17.0: reg 0x18: [io 0x6050-0x6057] Feb 13 08:16:42.548140 kernel: pci 0000:00:17.0: reg 0x1c: [io 0x6040-0x6043] Feb 13 08:16:42.548187 kernel: pci 0000:00:17.0: reg 0x20: [io 0x6020-0x603f] Feb 13 08:16:42.548229 kernel: pci 0000:00:17.0: reg 0x24: [mem 0x95516000-0x955167ff] Feb 13 08:16:42.548269 kernel: pci 0000:00:17.0: PME# supported from D3hot Feb 13 08:16:42.548313 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 Feb 13 08:16:42.548356 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.548401 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 Feb 13 08:16:42.548442 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.548492 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 Feb 13 08:16:42.548533 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.548578 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 Feb 13 08:16:42.548619 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.548666 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 Feb 13 08:16:42.548709 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.548753 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 Feb 13 08:16:42.548794 kernel: pci 0000:00:1e.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit] Feb 13 08:16:42.548840 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 Feb 13 08:16:42.548886 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 Feb 13 08:16:42.548926 kernel: pci 0000:00:1f.4: reg 0x10: [mem 0x95514000-0x955140ff 64bit] Feb 13 08:16:42.548967 kernel: pci 0000:00:1f.4: reg 0x20: [io 0xefa0-0xefbf] Feb 13 08:16:42.549010 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 Feb 13 08:16:42.549051 kernel: pci 0000:00:1f.5: reg 0x10: [mem 0xfe010000-0xfe010fff] Feb 13 08:16:42.549098 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 Feb 13 08:16:42.549140 kernel: pci 0000:01:00.0: reg 0x10: [mem 0x92000000-0x93ffffff 64bit pref] Feb 13 08:16:42.549185 kernel: pci 0000:01:00.0: reg 0x30: [mem 0x95200000-0x952fffff pref] Feb 13 08:16:42.549226 kernel: pci 0000:01:00.0: PME# supported from D3cold Feb 13 08:16:42.549268 kernel: pci 0000:01:00.0: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref] Feb 13 08:16:42.549310 kernel: pci 0000:01:00.0: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs) Feb 13 08:16:42.549356 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 Feb 13 08:16:42.549398 kernel: pci 0000:01:00.1: reg 0x10: [mem 0x90000000-0x91ffffff 64bit 
pref] Feb 13 08:16:42.549442 kernel: pci 0000:01:00.1: reg 0x30: [mem 0x95100000-0x951fffff pref] Feb 13 08:16:42.549502 kernel: pci 0000:01:00.1: PME# supported from D3cold Feb 13 08:16:42.549545 kernel: pci 0000:01:00.1: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref] Feb 13 08:16:42.549588 kernel: pci 0000:01:00.1: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs) Feb 13 08:16:42.549630 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 08:16:42.549671 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Feb 13 08:16:42.549712 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Feb 13 08:16:42.549754 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Feb 13 08:16:42.549806 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 Feb 13 08:16:42.549850 kernel: pci 0000:03:00.0: reg 0x10: [mem 0x95400000-0x9547ffff] Feb 13 08:16:42.549893 kernel: pci 0000:03:00.0: reg 0x18: [io 0x5000-0x501f] Feb 13 08:16:42.549937 kernel: pci 0000:03:00.0: reg 0x1c: [mem 0x95480000-0x95483fff] Feb 13 08:16:42.549979 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.550021 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Feb 13 08:16:42.550062 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Feb 13 08:16:42.550103 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Feb 13 08:16:42.550152 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 Feb 13 08:16:42.550195 kernel: pci 0000:04:00.0: reg 0x10: [mem 0x95300000-0x9537ffff] Feb 13 08:16:42.550237 kernel: pci 0000:04:00.0: reg 0x18: [io 0x4000-0x401f] Feb 13 08:16:42.550279 kernel: pci 0000:04:00.0: reg 0x1c: [mem 0x95380000-0x95383fff] Feb 13 08:16:42.550322 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Feb 13 08:16:42.550363 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Feb 13 08:16:42.550405 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Feb 13 08:16:42.550448 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Feb 13 08:16:42.550514 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Feb 13 08:16:42.550560 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 Feb 13 08:16:42.550605 kernel: pci 0000:06:00.0: enabling Extended Tags Feb 13 08:16:42.550648 kernel: pci 0000:06:00.0: supports D1 D2 Feb 13 08:16:42.550691 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 08:16:42.550732 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Feb 13 08:16:42.550773 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Feb 13 08:16:42.550816 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:16:42.550860 kernel: pci_bus 0000:07: extended config space not accessible Feb 13 08:16:42.550908 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 Feb 13 08:16:42.550952 kernel: pci 0000:07:00.0: reg 0x10: [mem 0x94000000-0x94ffffff] Feb 13 08:16:42.550998 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x95000000-0x9501ffff] Feb 13 08:16:42.551041 kernel: pci 0000:07:00.0: reg 0x18: [io 0x3000-0x307f] Feb 13 08:16:42.551086 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 08:16:42.551185 kernel: pci 0000:07:00.0: supports D1 D2 Feb 13 08:16:42.551231 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 08:16:42.551274 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Feb 13 08:16:42.551316 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Feb 13 
08:16:42.551358 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:16:42.551366 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Feb 13 08:16:42.551372 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Feb 13 08:16:42.551377 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Feb 13 08:16:42.551383 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Feb 13 08:16:42.551388 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Feb 13 08:16:42.551394 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Feb 13 08:16:42.551399 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Feb 13 08:16:42.551404 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Feb 13 08:16:42.551409 kernel: iommu: Default domain type: Translated Feb 13 08:16:42.551414 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 08:16:42.551458 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Feb 13 08:16:42.551547 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 08:16:42.551592 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Feb 13 08:16:42.551600 kernel: vgaarb: loaded Feb 13 08:16:42.551605 kernel: pps_core: LinuxPPS API ver. 1 registered Feb 13 08:16:42.551610 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Feb 13 08:16:42.551616 kernel: PTP clock support registered Feb 13 08:16:42.551621 kernel: PCI: Using ACPI for IRQ routing Feb 13 08:16:42.551626 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 08:16:42.551631 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Feb 13 08:16:42.551637 kernel: e820: reserve RAM buffer [mem 0x820e2000-0x83ffffff] Feb 13 08:16:42.551642 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Feb 13 08:16:42.551647 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Feb 13 08:16:42.551652 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Feb 13 08:16:42.551657 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Feb 13 08:16:42.551662 kernel: clocksource: Switched to clocksource tsc-early Feb 13 08:16:42.551668 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 08:16:42.551673 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 08:16:42.551678 kernel: pnp: PnP ACPI init Feb 13 08:16:42.551723 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Feb 13 08:16:42.551764 kernel: pnp 00:02: [dma 0 disabled] Feb 13 08:16:42.551804 kernel: pnp 00:03: [dma 0 disabled] Feb 13 08:16:42.551843 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Feb 13 08:16:42.551881 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Feb 13 08:16:42.551920 kernel: system 00:05: [io 0x1854-0x1857] has been reserved Feb 13 08:16:42.551962 kernel: system 00:06: [mem 0xfed10000-0xfed17fff] has been reserved Feb 13 08:16:42.551998 kernel: system 00:06: [mem 0xfed18000-0xfed18fff] has been reserved Feb 13 08:16:42.552035 kernel: system 00:06: [mem 0xfed19000-0xfed19fff] has been reserved Feb 13 08:16:42.552072 kernel: system 00:06: [mem 0xe0000000-0xefffffff] has been reserved Feb 13 08:16:42.552108 kernel: system 00:06: [mem 0xfed20000-0xfed3ffff] has been reserved Feb 13 08:16:42.552144 kernel: system 00:06: [mem 0xfed90000-0xfed93fff] could not be reserved Feb 13 08:16:42.552180 kernel: system 00:06: [mem 0xfed45000-0xfed8ffff] has been reserved Feb 13 08:16:42.552218 kernel: system 00:06: [mem 0xfee00000-0xfeefffff] 
could not be reserved Feb 13 08:16:42.552259 kernel: system 00:07: [io 0x1800-0x18fe] could not be reserved Feb 13 08:16:42.552296 kernel: system 00:07: [mem 0xfd000000-0xfd69ffff] has been reserved Feb 13 08:16:42.552332 kernel: system 00:07: [mem 0xfd6c0000-0xfd6cffff] has been reserved Feb 13 08:16:42.552368 kernel: system 00:07: [mem 0xfd6f0000-0xfdffffff] has been reserved Feb 13 08:16:42.552404 kernel: system 00:07: [mem 0xfe000000-0xfe01ffff] could not be reserved Feb 13 08:16:42.552441 kernel: system 00:07: [mem 0xfe200000-0xfe7fffff] has been reserved Feb 13 08:16:42.552503 kernel: system 00:07: [mem 0xff000000-0xffffffff] has been reserved Feb 13 08:16:42.552562 kernel: system 00:08: [io 0x2000-0x20fe] has been reserved Feb 13 08:16:42.552570 kernel: pnp: PnP ACPI: found 10 devices Feb 13 08:16:42.552575 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 08:16:42.552581 kernel: NET: Registered PF_INET protocol family Feb 13 08:16:42.552586 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 08:16:42.552591 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Feb 13 08:16:42.552596 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 08:16:42.552603 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 08:16:42.552608 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear) Feb 13 08:16:42.552614 kernel: TCP: Hash tables configured (established 262144 bind 65536) Feb 13 08:16:42.552619 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 08:16:42.552624 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 08:16:42.552629 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 08:16:42.552634 kernel: NET: Registered PF_XDP protocol family Feb 13 08:16:42.552676 kernel: pci 0000:00:15.0: BAR 0: assigned [mem 0x95515000-0x95515fff 64bit] Feb 13 08:16:42.552719 kernel: pci 0000:00:15.1: BAR 0: assigned [mem 0x9551b000-0x9551bfff 64bit] Feb 13 08:16:42.552760 kernel: pci 0000:00:1e.0: BAR 0: assigned [mem 0x9551c000-0x9551cfff 64bit] Feb 13 08:16:42.552802 kernel: pci 0000:01:00.0: BAR 7: no space for [mem size 0x00800000 64bit pref] Feb 13 08:16:42.552845 kernel: pci 0000:01:00.0: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Feb 13 08:16:42.552888 kernel: pci 0000:01:00.1: BAR 7: no space for [mem size 0x00800000 64bit pref] Feb 13 08:16:42.552929 kernel: pci 0000:01:00.1: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Feb 13 08:16:42.552970 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 08:16:42.553012 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Feb 13 08:16:42.553055 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Feb 13 08:16:42.553096 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Feb 13 08:16:42.553136 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Feb 13 08:16:42.553177 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Feb 13 08:16:42.553218 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Feb 13 08:16:42.553261 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Feb 13 08:16:42.553301 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Feb 13 08:16:42.553342 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Feb 13 08:16:42.553383 kernel: pci 0000:00:1c.0: PCI bridge 
to [bus 05] Feb 13 08:16:42.553425 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Feb 13 08:16:42.553468 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Feb 13 08:16:42.553534 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:16:42.553576 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Feb 13 08:16:42.553618 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Feb 13 08:16:42.553661 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:16:42.553698 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Feb 13 08:16:42.553735 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Feb 13 08:16:42.553771 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Feb 13 08:16:42.553807 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Feb 13 08:16:42.553842 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Feb 13 08:16:42.553877 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Feb 13 08:16:42.553920 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Feb 13 08:16:42.553961 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Feb 13 08:16:42.554003 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Feb 13 08:16:42.554042 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Feb 13 08:16:42.554085 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Feb 13 08:16:42.554123 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Feb 13 08:16:42.554167 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Feb 13 08:16:42.554208 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Feb 13 08:16:42.554248 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Feb 13 08:16:42.554289 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Feb 13 08:16:42.554297 kernel: PCI: CLS 64 bytes, default 64 Feb 13 08:16:42.554303 kernel: DMAR: No ATSR found Feb 13 08:16:42.554308 kernel: DMAR: No SATC found Feb 13 08:16:42.554314 kernel: DMAR: dmar0: Using Queued invalidation Feb 13 08:16:42.554355 kernel: pci 0000:00:00.0: Adding to iommu group 0 Feb 13 08:16:42.554401 kernel: pci 0000:00:01.0: Adding to iommu group 1 Feb 13 08:16:42.554443 kernel: pci 0000:00:08.0: Adding to iommu group 2 Feb 13 08:16:42.554487 kernel: pci 0000:00:12.0: Adding to iommu group 3 Feb 13 08:16:42.554528 kernel: pci 0000:00:14.0: Adding to iommu group 4 Feb 13 08:16:42.554568 kernel: pci 0000:00:14.2: Adding to iommu group 4 Feb 13 08:16:42.554610 kernel: pci 0000:00:15.0: Adding to iommu group 5 Feb 13 08:16:42.554650 kernel: pci 0000:00:15.1: Adding to iommu group 5 Feb 13 08:16:42.554692 kernel: pci 0000:00:16.0: Adding to iommu group 6 Feb 13 08:16:42.554735 kernel: pci 0000:00:16.1: Adding to iommu group 6 Feb 13 08:16:42.554777 kernel: pci 0000:00:16.4: Adding to iommu group 6 Feb 13 08:16:42.554817 kernel: pci 0000:00:17.0: Adding to iommu group 7 Feb 13 08:16:42.554859 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Feb 13 08:16:42.554900 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Feb 13 08:16:42.554941 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Feb 13 08:16:42.554983 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Feb 13 08:16:42.555024 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Feb 13 08:16:42.555067 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Feb 13 08:16:42.555108 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Feb 13 
08:16:42.555150 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Feb 13 08:16:42.555190 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Feb 13 08:16:42.555233 kernel: pci 0000:01:00.0: Adding to iommu group 1 Feb 13 08:16:42.555276 kernel: pci 0000:01:00.1: Adding to iommu group 1 Feb 13 08:16:42.555318 kernel: pci 0000:03:00.0: Adding to iommu group 15 Feb 13 08:16:42.555362 kernel: pci 0000:04:00.0: Adding to iommu group 16 Feb 13 08:16:42.555406 kernel: pci 0000:06:00.0: Adding to iommu group 17 Feb 13 08:16:42.555451 kernel: pci 0000:07:00.0: Adding to iommu group 17 Feb 13 08:16:42.555459 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Feb 13 08:16:42.555465 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Feb 13 08:16:42.555472 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Feb 13 08:16:42.555478 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Feb 13 08:16:42.555483 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Feb 13 08:16:42.555488 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Feb 13 08:16:42.555495 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Feb 13 08:16:42.555539 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Feb 13 08:16:42.555547 kernel: Initialise system trusted keyrings Feb 13 08:16:42.555552 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Feb 13 08:16:42.555558 kernel: Key type asymmetric registered Feb 13 08:16:42.555563 kernel: Asymmetric key parser 'x509' registered Feb 13 08:16:42.555568 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Feb 13 08:16:42.555573 kernel: io scheduler mq-deadline registered Feb 13 08:16:42.555580 kernel: io scheduler kyber registered Feb 13 08:16:42.555585 kernel: io scheduler bfq registered Feb 13 08:16:42.555627 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Feb 13 08:16:42.555668 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Feb 13 08:16:42.555710 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Feb 13 08:16:42.555751 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Feb 13 08:16:42.555792 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Feb 13 08:16:42.555834 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Feb 13 08:16:42.555882 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Feb 13 08:16:42.555890 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Feb 13 08:16:42.555896 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Feb 13 08:16:42.555901 kernel: pstore: Registered erst as persistent store backend Feb 13 08:16:42.555906 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 13 08:16:42.555912 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 08:16:42.555917 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 08:16:42.555922 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Feb 13 08:16:42.555929 kernel: hpet_acpi_add: no address or irqs in _CRS Feb 13 08:16:42.555971 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Feb 13 08:16:42.555979 kernel: i8042: PNP: No PS/2 controller found. 
Feb 13 08:16:42.556016 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Feb 13 08:16:42.556055 kernel: rtc_cmos rtc_cmos: registered as rtc0 Feb 13 08:16:42.556092 kernel: rtc_cmos rtc_cmos: setting system clock to 2024-02-13T08:16:41 UTC (1707812201) Feb 13 08:16:42.556130 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Feb 13 08:16:42.556138 kernel: fail to initialize ptp_kvm Feb 13 08:16:42.556145 kernel: intel_pstate: Intel P-state driver initializing Feb 13 08:16:42.556150 kernel: intel_pstate: Disabling energy efficiency optimization Feb 13 08:16:42.556155 kernel: intel_pstate: HWP enabled Feb 13 08:16:42.556161 kernel: vesafb: mode is 1024x768x8, linelength=1024, pages=0 Feb 13 08:16:42.556166 kernel: vesafb: scrolling: redraw Feb 13 08:16:42.556171 kernel: vesafb: Pseudocolor: size=0:8:8:8, shift=0:0:0:0 Feb 13 08:16:42.556176 kernel: vesafb: framebuffer at 0x94000000, mapped to 0x00000000f0e96ffb, using 768k, total 768k Feb 13 08:16:42.556182 kernel: Console: switching to colour frame buffer device 128x48 Feb 13 08:16:42.556187 kernel: fb0: VESA VGA frame buffer device Feb 13 08:16:42.556193 kernel: NET: Registered PF_INET6 protocol family Feb 13 08:16:42.556198 kernel: Segment Routing with IPv6 Feb 13 08:16:42.556203 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 08:16:42.556209 kernel: NET: Registered PF_PACKET protocol family Feb 13 08:16:42.556214 kernel: Key type dns_resolver registered Feb 13 08:16:42.556219 kernel: microcode: sig=0x906ed, pf=0x2, revision=0xf4 Feb 13 08:16:42.556224 kernel: microcode: Microcode Update Driver: v2.2. Feb 13 08:16:42.556229 kernel: IPI shorthand broadcast: enabled Feb 13 08:16:42.556235 kernel: sched_clock: Marking stable (2075332715, 1339072979)->(4436565083, -1022159389) Feb 13 08:16:42.556241 kernel: registered taskstats version 1 Feb 13 08:16:42.556246 kernel: Loading compiled-in X.509 certificates Feb 13 08:16:42.556251 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.148-flatcar: 253e5c5c936b12e2ff2626e7f3214deb753330c8' Feb 13 08:16:42.556257 kernel: Key type .fscrypt registered Feb 13 08:16:42.556262 kernel: Key type fscrypt-provisioning registered Feb 13 08:16:42.556267 kernel: pstore: Using crash dump compression: deflate Feb 13 08:16:42.556272 kernel: ima: Allocated hash algorithm: sha1 Feb 13 08:16:42.556278 kernel: ima: No architecture policies found Feb 13 08:16:42.556283 kernel: Freeing unused kernel image (initmem) memory: 45496K Feb 13 08:16:42.556289 kernel: Write protecting the kernel read-only data: 28672k Feb 13 08:16:42.556294 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Feb 13 08:16:42.556299 kernel: Freeing unused kernel image (rodata/data gap) memory: 636K Feb 13 08:16:42.556305 kernel: Run /init as init process Feb 13 08:16:42.556310 kernel: with arguments: Feb 13 08:16:42.556315 kernel: /init Feb 13 08:16:42.556320 kernel: with environment: Feb 13 08:16:42.556326 kernel: HOME=/ Feb 13 08:16:42.556331 kernel: TERM=linux Feb 13 08:16:42.556336 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 08:16:42.556343 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 13 08:16:42.556350 systemd[1]: Detected architecture x86-64. 
Feb 13 08:16:42.556355 systemd[1]: Running in initrd. Feb 13 08:16:42.556361 systemd[1]: No hostname configured, using default hostname. Feb 13 08:16:42.556366 systemd[1]: Hostname set to . Feb 13 08:16:42.556371 systemd[1]: Initializing machine ID from random generator. Feb 13 08:16:42.556378 systemd[1]: Queued start job for default target initrd.target. Feb 13 08:16:42.556383 systemd[1]: Started systemd-ask-password-console.path. Feb 13 08:16:42.556389 systemd[1]: Reached target cryptsetup.target. Feb 13 08:16:42.556394 systemd[1]: Reached target paths.target. Feb 13 08:16:42.556399 systemd[1]: Reached target slices.target. Feb 13 08:16:42.556405 systemd[1]: Reached target swap.target. Feb 13 08:16:42.556410 systemd[1]: Reached target timers.target. Feb 13 08:16:42.556415 systemd[1]: Listening on iscsid.socket. Feb 13 08:16:42.556422 systemd[1]: Listening on iscsiuio.socket. Feb 13 08:16:42.556428 systemd[1]: Listening on systemd-journald-audit.socket. Feb 13 08:16:42.556433 systemd[1]: Listening on systemd-journald-dev-log.socket. Feb 13 08:16:42.556439 systemd[1]: Listening on systemd-journald.socket. Feb 13 08:16:42.556444 systemd[1]: Listening on systemd-networkd.socket. Feb 13 08:16:42.556450 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 08:16:42.556455 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 08:16:42.556461 kernel: tsc: Refined TSC clocksource calibration: 3407.998 MHz Feb 13 08:16:42.556469 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd208cfc, max_idle_ns: 440795283699 ns Feb 13 08:16:42.556474 systemd[1]: Reached target sockets.target. Feb 13 08:16:42.556480 kernel: clocksource: Switched to clocksource tsc Feb 13 08:16:42.556485 systemd[1]: Starting kmod-static-nodes.service... Feb 13 08:16:42.556491 systemd[1]: Finished network-cleanup.service. Feb 13 08:16:42.556496 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 08:16:42.556502 systemd[1]: Starting systemd-journald.service... Feb 13 08:16:42.556507 systemd[1]: Starting systemd-modules-load.service... Feb 13 08:16:42.556514 systemd-journald[267]: Journal started Feb 13 08:16:42.556541 systemd-journald[267]: Runtime Journal (/run/log/journal/4a4b544a434345e984306c8067d731cd) is 8.0M, max 640.1M, 632.1M free. Feb 13 08:16:42.559903 systemd-modules-load[268]: Inserted module 'overlay' Feb 13 08:16:42.565000 audit: BPF prog-id=6 op=LOAD Feb 13 08:16:42.584510 kernel: audit: type=1334 audit(1707812202.565:2): prog-id=6 op=LOAD Feb 13 08:16:42.584540 systemd[1]: Starting systemd-resolved.service... Feb 13 08:16:42.633512 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 08:16:42.633546 systemd[1]: Starting systemd-vconsole-setup.service... Feb 13 08:16:42.665503 kernel: Bridge firewalling registered Feb 13 08:16:42.665520 systemd[1]: Started systemd-journald.service. Feb 13 08:16:42.679414 systemd-modules-load[268]: Inserted module 'br_netfilter' Feb 13 08:16:42.727866 kernel: audit: type=1130 audit(1707812202.686:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:42.685617 systemd-resolved[270]: Positive Trust Anchors: Feb 13 08:16:42.790508 kernel: SCSI subsystem initialized Feb 13 08:16:42.790518 kernel: audit: type=1130 audit(1707812202.738:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.790528 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Feb 13 08:16:42.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.685624 systemd-resolved[270]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 08:16:42.902110 kernel: device-mapper: uevent: version 1.0.3 Feb 13 08:16:42.902123 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Feb 13 08:16:42.902130 kernel: audit: type=1130 audit(1707812202.858:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.858000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.685643 systemd-resolved[270]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 08:16:42.975715 kernel: audit: type=1130 audit(1707812202.909:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.687194 systemd-resolved[270]: Defaulting to hostname 'linux'. Feb 13 08:16:42.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:42.687693 systemd[1]: Started systemd-resolved.service. Feb 13 08:16:43.083155 kernel: audit: type=1130 audit(1707812202.983:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.083166 kernel: audit: type=1130 audit(1707812203.036:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.036000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 13 08:16:42.739650 systemd[1]: Finished kmod-static-nodes.service. Feb 13 08:16:42.860001 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 08:16:42.902609 systemd-modules-load[268]: Inserted module 'dm_multipath' Feb 13 08:16:42.910763 systemd[1]: Finished systemd-modules-load.service. Feb 13 08:16:42.984816 systemd[1]: Finished systemd-vconsole-setup.service. Feb 13 08:16:43.037749 systemd[1]: Reached target nss-lookup.target. Feb 13 08:16:43.092047 systemd[1]: Starting dracut-cmdline-ask.service... Feb 13 08:16:43.111972 systemd[1]: Starting systemd-sysctl.service... Feb 13 08:16:43.121215 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 13 08:16:43.122009 systemd[1]: Finished systemd-sysctl.service. Feb 13 08:16:43.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.124526 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Feb 13 08:16:43.169674 kernel: audit: type=1130 audit(1707812203.120:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.183000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.184804 systemd[1]: Finished dracut-cmdline-ask.service. Feb 13 08:16:43.251590 kernel: audit: type=1130 audit(1707812203.183:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.241000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.243086 systemd[1]: Starting dracut-cmdline.service... Feb 13 08:16:43.267576 dracut-cmdline[291]: dracut-dracut-053 Feb 13 08:16:43.267576 dracut-cmdline[291]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LA Feb 13 08:16:43.267576 dracut-cmdline[291]: BEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4 Feb 13 08:16:43.335554 kernel: Loading iSCSI transport class v2.0-870. Feb 13 08:16:43.335566 kernel: iscsi: registered transport (tcp) Feb 13 08:16:43.382504 kernel: iscsi: registered transport (qla4xxx) Feb 13 08:16:43.382523 kernel: QLogic iSCSI HBA Driver Feb 13 08:16:43.398552 systemd[1]: Finished dracut-cmdline.service. Feb 13 08:16:43.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:43.399072 systemd[1]: Starting dracut-pre-udev.service... 
Feb 13 08:16:43.454545 kernel: raid6: avx2x4 gen() 48963 MB/s Feb 13 08:16:43.489544 kernel: raid6: avx2x4 xor() 22335 MB/s Feb 13 08:16:43.524549 kernel: raid6: avx2x2 gen() 53843 MB/s Feb 13 08:16:43.559503 kernel: raid6: avx2x2 xor() 32125 MB/s Feb 13 08:16:43.594544 kernel: raid6: avx2x1 gen() 45272 MB/s Feb 13 08:16:43.629548 kernel: raid6: avx2x1 xor() 27943 MB/s Feb 13 08:16:43.664502 kernel: raid6: sse2x4 gen() 21367 MB/s Feb 13 08:16:43.698544 kernel: raid6: sse2x4 xor() 11983 MB/s Feb 13 08:16:43.732499 kernel: raid6: sse2x2 gen() 21660 MB/s Feb 13 08:16:43.766502 kernel: raid6: sse2x2 xor() 13473 MB/s Feb 13 08:16:43.800544 kernel: raid6: sse2x1 gen() 18304 MB/s Feb 13 08:16:43.852082 kernel: raid6: sse2x1 xor() 8931 MB/s Feb 13 08:16:43.852097 kernel: raid6: using algorithm avx2x2 gen() 53843 MB/s Feb 13 08:16:43.852105 kernel: raid6: .... xor() 32125 MB/s, rmw enabled Feb 13 08:16:43.870128 kernel: raid6: using avx2x2 recovery algorithm Feb 13 08:16:43.916472 kernel: xor: automatically using best checksumming function avx Feb 13 08:16:43.994481 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Feb 13 08:16:43.999847 systemd[1]: Finished dracut-pre-udev.service. Feb 13 08:16:44.007000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:44.007000 audit: BPF prog-id=7 op=LOAD Feb 13 08:16:44.007000 audit: BPF prog-id=8 op=LOAD Feb 13 08:16:44.009388 systemd[1]: Starting systemd-udevd.service... Feb 13 08:16:44.016971 systemd-udevd[472]: Using default interface naming scheme 'v252'. Feb 13 08:16:44.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:44.024868 systemd[1]: Started systemd-udevd.service. Feb 13 08:16:44.066601 dracut-pre-trigger[485]: rd.md=0: removing MD RAID activation Feb 13 08:16:44.041220 systemd[1]: Starting dracut-pre-trigger.service... Feb 13 08:16:44.082000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:44.068239 systemd[1]: Finished dracut-pre-trigger.service. Feb 13 08:16:44.084335 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 08:16:44.132027 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 08:16:44.130000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:44.160477 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 08:16:44.187479 kernel: AVX2 version of gcm_enc/dec engaged. Feb 13 08:16:44.187516 kernel: libata version 3.00 loaded. Feb 13 08:16:44.187525 kernel: ACPI: bus type USB registered Feb 13 08:16:44.221881 kernel: usbcore: registered new interface driver usbfs Feb 13 08:16:44.221897 kernel: usbcore: registered new interface driver hub Feb 13 08:16:44.239513 kernel: AES CTR mode by8 optimization enabled Feb 13 08:16:44.239529 kernel: usbcore: registered new device driver usb Feb 13 08:16:44.273472 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Feb 13 08:16:44.307296 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. 
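The raid6 lines above show the kernel timing every gen()/xor() candidate and settling on the fastest one, avx2x2 at 53843 MB/s, before "xor: automatically using best checksumming function avx". A minimal Go sketch of that benchmark-and-pick pattern follows; the two candidate functions are illustrative stand-ins, not the kernel's SIMD routines, and the 50 ms window is an arbitrary choice.

    package main

    import (
        "fmt"
        "time"
    )

    // pickFastest runs each candidate repeatedly for a fixed window and returns
    // the name of the one that completed the most iterations, mirroring how the
    // boot log above settles on "avx2x2" for the raid6 gen()/xor() routines.
    func pickFastest(candidates map[string]func()) string {
        best := ""
        bestIters := 0
        for name, fn := range candidates {
            deadline := time.Now().Add(50 * time.Millisecond)
            iters := 0
            for time.Now().Before(deadline) {
                fn()
                iters++
            }
            fmt.Printf("%s: %d iterations in 50ms\n", name, iters)
            if iters > bestIters {
                best, bestIters = name, iters
            }
        }
        return best
    }

    func main() {
        buf := make([]byte, 64*1024)
        // Stand-in workloads; the kernel benchmarks its real SSE2/AVX2 kernels here.
        candidates := map[string]func(){
            "xor-bytewise": func() {
                for i := range buf {
                    buf[i] ^= 0x5a
                }
            },
            "xor-strided": func() {
                for i := 0; i < len(buf); i += 8 {
                    buf[i] ^= 0x5a
                }
            },
        }
        fmt.Println("using algorithm", pickFastest(candidates))
    }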
Feb 13 08:16:44.327499 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 08:16:44.327587 kernel: mlx5_core 0000:01:00.0: firmware version: 14.28.2006 Feb 13 08:16:44.327644 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Feb 13 08:16:44.348497 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 08:16:44.348572 kernel: pps pps0: new PPS source ptp0 Feb 13 08:16:44.348632 kernel: igb 0000:03:00.0: added PHC on eth0 Feb 13 08:16:44.348688 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 08:16:44.348738 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d8:82 Feb 13 08:16:44.348787 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Feb 13 08:16:44.348834 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Feb 13 08:16:44.382503 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Feb 13 08:16:44.396476 kernel: pps pps1: new PPS source ptp1 Feb 13 08:16:44.412485 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 08:16:44.412554 kernel: igb 0000:04:00.0: added PHC on eth1 Feb 13 08:16:44.412610 kernel: ahci 0000:00:17.0: version 3.0 Feb 13 08:16:44.412662 kernel: ahci 0000:00:17.0: AHCI 0001.0301 32 slots 7 ports 6 Gbps 0x7f impl SATA mode Feb 13 08:16:44.412710 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Feb 13 08:16:44.428523 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Feb 13 08:16:44.432474 kernel: scsi host0: ahci Feb 13 08:16:44.432562 kernel: scsi host1: ahci Feb 13 08:16:44.432618 kernel: scsi host2: ahci Feb 13 08:16:44.432674 kernel: scsi host3: ahci Feb 13 08:16:44.432754 kernel: scsi host4: ahci Feb 13 08:16:44.432834 kernel: scsi host5: ahci Feb 13 08:16:44.432889 kernel: scsi host6: ahci Feb 13 08:16:44.432941 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 138 Feb 13 08:16:44.432949 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 138 Feb 13 08:16:44.432955 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 138 Feb 13 08:16:44.432961 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 138 Feb 13 08:16:44.432968 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 138 Feb 13 08:16:44.432974 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 138 Feb 13 08:16:44.432980 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 138 Feb 13 08:16:44.460426 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 08:16:44.460506 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Feb 13 08:16:44.477411 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:70:d8:83 Feb 13 08:16:44.508872 kernel: hub 1-0:1.0: USB hub found Feb 13 08:16:44.522861 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Feb 13 08:16:44.522933 kernel: hub 1-0:1.0: 16 ports detected Feb 13 08:16:44.535806 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Feb 13 08:16:44.570551 kernel: hub 2-0:1.0: USB hub found Feb 13 08:16:44.646515 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Feb 13 08:16:44.646590 kernel: hub 2-0:1.0: 10 ports detected Feb 13 08:16:44.701129 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 08:16:44.701205 kernel: usb: port power management may be unreliable Feb 13 08:16:44.746479 kernel: ata5: SATA link down (SStatus 0 SControl 300) Feb 13 08:16:44.868519 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Feb 13 08:16:44.868545 kernel: ata6: SATA link down (SStatus 0 SControl 300) Feb 13 08:16:44.971511 kernel: ata4: SATA link down (SStatus 0 SControl 300) Feb 13 08:16:44.985566 kernel: ata7: SATA link down (SStatus 0 SControl 300) Feb 13 08:16:44.999537 kernel: mlx5_core 0000:01:00.0: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 08:16:44.999607 kernel: ata3: SATA link down (SStatus 0 SControl 300) Feb 13 08:16:45.012525 kernel: hub 1-14:1.0: USB hub found Feb 13 08:16:45.012667 kernel: hub 1-14:1.0: 4 ports detected Feb 13 08:16:45.031546 kernel: mlx5_core 0000:01:00.1: firmware version: 14.28.2006 Feb 13 08:16:45.031632 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 08:16:45.055229 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 08:16:45.055297 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 08:16:45.116507 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Feb 13 08:16:45.132499 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Feb 13 08:16:45.178806 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 08:16:45.178821 kernel: ata2.00: Features: NCQ-prio Feb 13 08:16:45.178828 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 08:16:45.206137 kernel: ata1.00: Features: NCQ-prio Feb 13 08:16:45.223527 kernel: ata2.00: configured for UDMA/133 Feb 13 08:16:45.223544 kernel: ata1.00: configured for UDMA/133 Feb 13 08:16:45.235542 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Feb 13 08:16:45.251490 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Feb 13 08:16:45.283472 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Feb 13 08:16:45.301235 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:45.301270 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 08:16:45.315481 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Feb 13 08:16:45.315579 kernel: sd 0:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 08:16:45.315772 kernel: sd 1:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 08:16:45.315941 kernel: sd 1:0:0:0: [sdb] 4096-byte physical blocks Feb 13 08:16:45.316087 kernel: sd 1:0:0:0: [sdb] Write Protect is off Feb 13 08:16:45.316239 kernel: sd 1:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Feb 13 08:16:45.316394 kernel: sd 1:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 08:16:45.316632 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 08:16:45.316651 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 08:16:45.316664 kernel: sd 1:0:0:0: [sdb] Attached SCSI disk Feb 13 08:16:45.376372 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Feb 13 
08:16:45.376446 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Feb 13 08:16:45.408002 kernel: port_module: 9 callbacks suppressed Feb 13 08:16:45.408021 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Feb 13 08:16:45.408094 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 13 08:16:45.497087 kernel: hid: raw HID events driver (C) Jiri Kosina Feb 13 08:16:45.497103 kernel: sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 Feb 13 08:16:45.543530 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 08:16:45.576533 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Feb 13 08:16:45.576610 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:45.622589 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 13 08:16:45.622605 kernel: GPT:9289727 != 937703087 Feb 13 08:16:45.622612 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 13 08:16:45.637825 kernel: GPT:9289727 != 937703087 Feb 13 08:16:45.650525 kernel: GPT: Use GNU Parted to correct GPT errors. Feb 13 08:16:45.664767 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 08:16:45.693386 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:45.693401 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 08:16:45.693475 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 13 08:16:45.756110 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Feb 13 08:16:45.831683 kernel: usbcore: registered new interface driver usbhid Feb 13 08:16:45.831698 kernel: usbhid: USB HID core driver Feb 13 08:16:45.831705 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (531) Feb 13 08:16:45.831715 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Feb 13 08:16:45.797593 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Feb 13 08:16:45.826569 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Feb 13 08:16:45.847350 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Feb 13 08:16:45.998663 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Feb 13 08:16:45.998768 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Feb 13 08:16:45.998777 kernel: mlx5_core 0000:01:00.1: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 08:16:45.998832 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Feb 13 08:16:45.998891 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth0 Feb 13 08:16:45.998941 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:45.878520 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Feb 13 08:16:46.055511 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 08:16:46.055524 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:45.984965 systemd[1]: Starting disk-uuid.service... Feb 13 08:16:46.074512 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 08:16:46.074524 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth2 Feb 13 08:16:46.074590 disk-uuid[676]: Primary Header is updated. Feb 13 08:16:46.074590 disk-uuid[676]: Secondary Entries is updated. 
Feb 13 08:16:46.074590 disk-uuid[676]: Secondary Header is updated. Feb 13 08:16:46.148505 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:46.148520 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 08:16:47.097876 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:16:47.117346 disk-uuid[677]: The operation has completed successfully. Feb 13 08:16:47.125557 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 08:16:47.153735 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 08:16:47.252482 kernel: audit: type=1130 audit(1707812207.159:19): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.252497 kernel: audit: type=1131 audit(1707812207.159:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.159000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.159000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.153780 systemd[1]: Finished disk-uuid.service. Feb 13 08:16:47.282575 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 08:16:47.169505 systemd[1]: Starting verity-setup.service... Feb 13 08:16:47.355951 systemd[1]: Found device dev-mapper-usr.device. Feb 13 08:16:47.367711 systemd[1]: Mounting sysusr-usr.mount... Feb 13 08:16:47.380083 systemd[1]: Finished verity-setup.service. Feb 13 08:16:47.393000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.448476 kernel: audit: type=1130 audit(1707812207.393:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.505000 systemd[1]: Mounted sysusr-usr.mount. Feb 13 08:16:47.519588 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Feb 13 08:16:47.511741 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Feb 13 08:16:47.605209 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 08:16:47.605225 kernel: BTRFS info (device sda6): using free space tree Feb 13 08:16:47.605232 kernel: BTRFS info (device sda6): has skinny extents Feb 13 08:16:47.605239 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 08:16:47.512143 systemd[1]: Starting ignition-setup.service... Feb 13 08:16:47.532808 systemd[1]: Starting parse-ip-for-networkd.service... Feb 13 08:16:47.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.614073 systemd[1]: Finished ignition-setup.service. 
Feb 13 08:16:47.737287 kernel: audit: type=1130 audit(1707812207.628:22): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.737301 kernel: audit: type=1130 audit(1707812207.686:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.629812 systemd[1]: Finished parse-ip-for-networkd.service. Feb 13 08:16:47.769549 kernel: audit: type=1334 audit(1707812207.745:24): prog-id=9 op=LOAD Feb 13 08:16:47.745000 audit: BPF prog-id=9 op=LOAD Feb 13 08:16:47.688109 systemd[1]: Starting ignition-fetch-offline.service... Feb 13 08:16:47.747411 systemd[1]: Starting systemd-networkd.service... Feb 13 08:16:47.783874 systemd-networkd[878]: lo: Link UP Feb 13 08:16:47.850525 kernel: audit: type=1130 audit(1707812207.798:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.798000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.813009 ignition[867]: Ignition 2.14.0 Feb 13 08:16:47.783876 systemd-networkd[878]: lo: Gained carrier Feb 13 08:16:47.813013 ignition[867]: Stage: fetch-offline Feb 13 08:16:47.784159 systemd-networkd[878]: Enumeration completed Feb 13 08:16:47.891000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.813039 ignition[867]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:16:48.028267 kernel: audit: type=1130 audit(1707812207.891:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:48.028281 kernel: audit: type=1130 audit(1707812207.952:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:48.028289 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 08:16:47.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.784238 systemd[1]: Started systemd-networkd.service. Feb 13 08:16:48.053349 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f1np1: link becomes ready Feb 13 08:16:47.813052 ignition[867]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:16:47.784864 systemd-networkd[878]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 13 08:16:47.821649 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:16:48.083000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.819726 systemd[1]: Reached target network.target. Feb 13 08:16:48.108592 iscsid[908]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 13 08:16:48.108592 iscsid[908]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log Feb 13 08:16:48.108592 iscsid[908]: into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Feb 13 08:16:48.108592 iscsid[908]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 13 08:16:48.108592 iscsid[908]: If using hardware iscsi like qla4xxx this message can be ignored. Feb 13 08:16:48.108592 iscsid[908]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 13 08:16:48.108592 iscsid[908]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 13 08:16:48.115000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:48.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:47.821714 ignition[867]: parsed url from cmdline: "" Feb 13 08:16:47.847583 unknown[867]: fetched base config from "system" Feb 13 08:16:48.291574 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 08:16:47.821716 ignition[867]: no config URL provided Feb 13 08:16:47.847589 unknown[867]: fetched user config from "system" Feb 13 08:16:47.821719 ignition[867]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 08:16:47.860204 systemd[1]: Starting iscsiuio.service... Feb 13 08:16:47.821748 ignition[867]: parsing config with SHA512: ebcc7c2532b1ea61e1b2fdd1cf780cdba5c7e0cce01e4a5b3142b31e75e72eeede5d0c434671bef7c63bb507eee9b3774c6520c92a25f91b154af1c03964fd8f Feb 13 08:16:47.877794 systemd[1]: Started iscsiuio.service. Feb 13 08:16:47.847993 ignition[867]: fetch-offline: fetch-offline passed Feb 13 08:16:47.892839 systemd[1]: Finished ignition-fetch-offline.service. Feb 13 08:16:47.847998 ignition[867]: POST message to Packet Timeline Feb 13 08:16:47.953742 systemd[1]: ignition-fetch.service was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 08:16:47.848004 ignition[867]: POST Status error: resource requires networking Feb 13 08:16:47.954155 systemd[1]: Starting ignition-kargs.service... Feb 13 08:16:47.848040 ignition[867]: Ignition finished successfully Feb 13 08:16:48.030216 systemd-networkd[878]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 08:16:48.032489 ignition[897]: Ignition 2.14.0 Feb 13 08:16:48.042981 systemd[1]: Starting iscsid.service... Feb 13 08:16:48.032518 ignition[897]: Stage: kargs Feb 13 08:16:48.064667 systemd[1]: Started iscsid.service.
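The iscsid warning above also states the fix it expects: an /etc/iscsi/initiatorname.iscsi file containing a single line of the form InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. A minimal Go sketch of writing such a file follows; the IQN it uses, iqn.2024-02.net.example:node0, is a hypothetical placeholder rather than a value taken from this host, and the 0644 mode is simply a reasonable default.

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        // Hypothetical IQN following the iqn.yyyy-mm.<reversed domain name>[:identifier]
        // layout quoted in the iscsid warning; replace with a real identifier.
        iqn := "iqn.2024-02.net.example:node0"

        line := fmt.Sprintf("InitiatorName=%s\n", iqn)
        // Writing to /etc requires root; on this host the warning is harmless
        // unless software iSCSI is actually in use.
        if err := os.WriteFile("/etc/iscsi/initiatorname.iscsi", []byte(line), 0o644); err != nil {
            fmt.Fprintln(os.Stderr, "writing initiatorname.iscsi:", err)
            os.Exit(1)
        }
    }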
Feb 13 08:16:48.032588 ignition[897]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:16:48.085043 systemd[1]: Starting dracut-initqueue.service... Feb 13 08:16:48.032598 ignition[897]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:16:48.101727 systemd[1]: Finished dracut-initqueue.service. Feb 13 08:16:48.035239 ignition[897]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:16:48.116692 systemd[1]: Reached target remote-fs-pre.target. Feb 13 08:16:48.036482 ignition[897]: kargs: kargs passed Feb 13 08:16:48.135672 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 08:16:48.036491 ignition[897]: POST message to Packet Timeline Feb 13 08:16:48.135780 systemd[1]: Reached target remote-fs.target. Feb 13 08:16:48.036512 ignition[897]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:16:48.170578 systemd[1]: Starting dracut-pre-mount.service... Feb 13 08:16:48.038633 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:36231->[::1]:53: read: connection refused Feb 13 08:16:48.205731 systemd[1]: Finished dracut-pre-mount.service. Feb 13 08:16:48.239017 ignition[897]: GET https://metadata.packet.net/metadata: attempt #2 Feb 13 08:16:48.286424 systemd-networkd[878]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 08:16:48.239410 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:48546->[::1]:53: read: connection refused Feb 13 08:16:48.314617 systemd-networkd[878]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 13 08:16:48.343101 systemd-networkd[878]: enp1s0f1np1: Link UP Feb 13 08:16:48.343296 systemd-networkd[878]: enp1s0f1np1: Gained carrier Feb 13 08:16:48.355922 systemd-networkd[878]: enp1s0f0np0: Link UP Feb 13 08:16:48.356246 systemd-networkd[878]: eno2: Link UP Feb 13 08:16:48.356561 systemd-networkd[878]: eno1: Link UP Feb 13 08:16:48.640217 ignition[897]: GET https://metadata.packet.net/metadata: attempt #3 Feb 13 08:16:48.641353 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:49809->[::1]:53: read: connection refused Feb 13 08:16:49.108488 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f0np0: link becomes ready Feb 13 08:16:49.108441 systemd-networkd[878]: enp1s0f0np0: Gained carrier Feb 13 08:16:49.141885 systemd-networkd[878]: enp1s0f0np0: DHCPv4 address 145.40.67.79/31, gateway 145.40.67.78 acquired from 145.40.83.140 Feb 13 08:16:49.405069 systemd-networkd[878]: enp1s0f1np1: Gained IPv6LL Feb 13 08:16:49.441892 ignition[897]: GET https://metadata.packet.net/metadata: attempt #4 Feb 13 08:16:49.443102 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:46665->[::1]:53: read: connection refused Feb 13 08:16:50.365056 systemd-networkd[878]: enp1s0f0np0: Gained IPv6LL Feb 13 08:16:51.044862 ignition[897]: GET https://metadata.packet.net/metadata: attempt #5 Feb 13 08:16:51.046199 ignition[897]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50720->[::1]:53: read: connection refused Feb 13 08:16:54.249510 ignition[897]: GET https://metadata.packet.net/metadata: attempt #6 Feb 13 08:16:54.291973 ignition[897]: GET result: OK Feb 13 08:16:54.646656 ignition[897]: Ignition finished successfully Feb 13 08:16:54.651092 systemd[1]: Finished ignition-kargs.service. Feb 13 08:16:54.738119 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:16:54.738141 kernel: audit: type=1130 audit(1707812214.660:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:54.660000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:54.671035 ignition[925]: Ignition 2.14.0 Feb 13 08:16:54.663735 systemd[1]: Starting ignition-disks.service... Feb 13 08:16:54.671062 ignition[925]: Stage: disks Feb 13 08:16:54.671136 ignition[925]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:16:54.671146 ignition[925]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:16:54.672703 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:16:54.674388 ignition[925]: disks: disks passed Feb 13 08:16:54.674391 ignition[925]: POST message to Packet Timeline Feb 13 08:16:54.674400 ignition[925]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:16:54.757730 ignition[925]: GET result: OK Feb 13 08:16:54.987099 ignition[925]: Ignition finished successfully Feb 13 08:16:54.990171 systemd[1]: Finished ignition-disks.service. 
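Attempts #1 through #6 above show Ignition re-issuing the same GET against https://metadata.packet.net/metadata: the early tries fail while DNS lookups against the stub resolver on [::1]:53 are refused, and the request finally returns "GET result: OK" once the uplinks have carrier and enp1s0f0np0 has its DHCP lease. The following Go sketch shows a generic fetch-with-retry loop in that spirit; it is not Ignition's actual implementation, and the attempt budget and doubling backoff are arbitrary choices.

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    // fetchWithRetry keeps re-issuing the GET until it succeeds or the attempt
    // budget is exhausted, doubling the delay between tries.
    func fetchWithRetry(url string, attempts int) ([]byte, error) {
        delay := time.Second
        for i := 1; i <= attempts; i++ {
            resp, err := http.Get(url)
            if err == nil && resp.StatusCode == http.StatusOK {
                body, readErr := io.ReadAll(resp.Body)
                resp.Body.Close()
                return body, readErr
            }
            if err == nil {
                resp.Body.Close()
            }
            fmt.Printf("GET %s: attempt #%d failed, retrying in %s\n", url, i, delay)
            time.Sleep(delay)
            delay *= 2
        }
        return nil, fmt.Errorf("giving up on %s after %d attempts", url, attempts)
    }

    func main() {
        body, err := fetchWithRetry("https://metadata.packet.net/metadata", 6)
        if err != nil {
            fmt.Println(err)
            return
        }
        fmt.Printf("fetched %d bytes of metadata\n", len(body))
    }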
Feb 13 08:16:55.001000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.003061 systemd[1]: Reached target initrd-root-device.target. Feb 13 08:16:55.082701 kernel: audit: type=1130 audit(1707812215.001:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.074654 systemd[1]: Reached target local-fs-pre.target. Feb 13 08:16:55.090669 systemd[1]: Reached target local-fs.target. Feb 13 08:16:55.105639 systemd[1]: Reached target sysinit.target. Feb 13 08:16:55.120689 systemd[1]: Reached target basic.target. Feb 13 08:16:55.135800 systemd[1]: Starting systemd-fsck-root.service... Feb 13 08:16:55.158751 systemd-fsck[942]: ROOT: clean, 602/553520 files, 56013/553472 blocks Feb 13 08:16:55.170231 systemd[1]: Finished systemd-fsck-root.service. Feb 13 08:16:55.258787 kernel: audit: type=1130 audit(1707812215.177:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.258803 kernel: EXT4-fs (sda9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Feb 13 08:16:55.177000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.179243 systemd[1]: Mounting sysroot.mount... Feb 13 08:16:55.266093 systemd[1]: Mounted sysroot.mount. Feb 13 08:16:55.279739 systemd[1]: Reached target initrd-root-fs.target. Feb 13 08:16:55.295280 systemd[1]: Mounting sysroot-usr.mount... Feb 13 08:16:55.310305 systemd[1]: Starting flatcar-metadata-hostname.service... Feb 13 08:16:55.325092 systemd[1]: Starting flatcar-static-network.service... Feb 13 08:16:55.340626 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 08:16:55.340666 systemd[1]: Reached target ignition-diskful.target. Feb 13 08:16:55.358683 systemd[1]: Mounted sysroot-usr.mount. Feb 13 08:16:55.381937 systemd[1]: Mounting sysroot-usr-share-oem.mount... Feb 13 08:16:55.394825 systemd[1]: Starting initrd-setup-root.service... Feb 13 08:16:55.520044 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (955) Feb 13 08:16:55.520060 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 08:16:55.520073 kernel: BTRFS info (device sda6): using free space tree Feb 13 08:16:55.520080 kernel: BTRFS info (device sda6): has skinny extents Feb 13 08:16:55.520088 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 08:16:55.453812 systemd[1]: Finished initrd-setup-root.service. Feb 13 08:16:55.581827 kernel: audit: type=1130 audit(1707812215.527:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.527000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:55.581863 coreos-metadata[950]: Feb 13 08:16:55.456 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 08:16:55.581863 coreos-metadata[950]: Feb 13 08:16:55.478 INFO Fetch successful Feb 13 08:16:55.767119 kernel: audit: type=1130 audit(1707812215.590:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.767131 kernel: audit: type=1130 audit(1707812215.653:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.767138 kernel: audit: type=1131 audit(1707812215.653:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.590000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.653000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.653000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.767194 coreos-metadata[949]: Feb 13 08:16:55.456 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 08:16:55.767194 coreos-metadata[949]: Feb 13 08:16:55.478 INFO Fetch successful Feb 13 08:16:55.767194 coreos-metadata[949]: Feb 13 08:16:55.495 INFO wrote hostname ci-3510.3.2-a-56b02fc11a to /sysroot/etc/hostname Feb 13 08:16:55.815511 initrd-setup-root[960]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 08:16:55.529807 systemd[1]: Finished flatcar-metadata-hostname.service. Feb 13 08:16:55.841000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.871684 initrd-setup-root[968]: cut: /sysroot/etc/group: No such file or directory Feb 13 08:16:55.909659 kernel: audit: type=1130 audit(1707812215.841:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.591770 systemd[1]: flatcar-static-network.service: Deactivated successfully. Feb 13 08:16:55.920715 initrd-setup-root[976]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 08:16:55.591810 systemd[1]: Finished flatcar-static-network.service. Feb 13 08:16:55.939678 initrd-setup-root[984]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 08:16:55.654741 systemd[1]: Mounted sysroot-usr-share-oem.mount. 
Feb 13 08:16:55.957679 ignition[1025]: INFO : Ignition 2.14.0 Feb 13 08:16:55.957679 ignition[1025]: INFO : Stage: mount Feb 13 08:16:55.957679 ignition[1025]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:16:55.957679 ignition[1025]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:16:55.957679 ignition[1025]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:16:55.957679 ignition[1025]: INFO : mount: mount passed Feb 13 08:16:55.957679 ignition[1025]: INFO : POST message to Packet Timeline Feb 13 08:16:55.957679 ignition[1025]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:16:55.957679 ignition[1025]: INFO : GET result: OK Feb 13 08:16:55.776121 systemd[1]: Starting ignition-mount.service... Feb 13 08:16:55.803128 systemd[1]: Starting sysroot-boot.service... Feb 13 08:16:56.061754 ignition[1025]: INFO : Ignition finished successfully Feb 13 08:16:56.068000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.824443 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully. Feb 13 08:16:56.145558 kernel: audit: type=1130 audit(1707812216.068:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:55.824489 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully. Feb 13 08:16:56.278562 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1038) Feb 13 08:16:56.278576 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 08:16:56.278584 kernel: BTRFS info (device sda6): using free space tree Feb 13 08:16:56.278591 kernel: BTRFS info (device sda6): has skinny extents Feb 13 08:16:56.278597 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 08:16:55.825143 systemd[1]: Finished sysroot-boot.service. Feb 13 08:16:56.055866 systemd[1]: Finished ignition-mount.service. Feb 13 08:16:56.071420 systemd[1]: Starting ignition-files.service... 
Feb 13 08:16:56.309596 ignition[1058]: INFO : Ignition 2.14.0 Feb 13 08:16:56.309596 ignition[1058]: INFO : Stage: files Feb 13 08:16:56.309596 ignition[1058]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:16:56.309596 ignition[1058]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:16:56.309596 ignition[1058]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:16:56.309596 ignition[1058]: DEBUG : files: compiled without relabeling support, skipping Feb 13 08:16:56.309596 ignition[1058]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 08:16:56.309596 ignition[1058]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 08:16:56.309596 ignition[1058]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 08:16:56.309596 ignition[1058]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 08:16:56.309596 ignition[1058]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 08:16:56.309596 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 08:16:56.309596 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Feb 13 08:16:56.137246 systemd[1]: Mounting sysroot-usr-share-oem.mount... Feb 13 08:16:56.480839 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 08:16:56.480839 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 08:16:56.480839 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Feb 13 08:16:56.480839 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Feb 13 08:16:56.480839 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz" Feb 13 08:16:56.480839 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET https://github.com/containernetworking/plugins/releases/download/v1.1.1/cni-plugins-linux-amd64-v1.1.1.tgz: attempt #1 Feb 13 08:16:56.272822 systemd[1]: Mounted sysroot-usr-share-oem.mount. 
Feb 13 08:16:56.297473 unknown[1058]: wrote ssh authorized keys file for user: core Feb 13 08:16:56.834552 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET result: OK Feb 13 08:16:56.910098 ignition[1058]: DEBUG : files: createFilesystemsFiles: createFiles: op(5): file matches expected sum of: 4d0ed0abb5951b9cf83cba938ef84bdc5b681f4ac869da8143974f6a53a3ff30c666389fa462b9d14d30af09bf03f6cdf77598c572f8fb3ea00cecdda467a48d Feb 13 08:16:56.936708 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz" Feb 13 08:16:56.936708 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz" Feb 13 08:16:56.936708 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/kubernetes-sigs/cri-tools/releases/download/v1.26.0/crictl-v1.26.0-linux-amd64.tar.gz: attempt #1 Feb 13 08:16:57.315848 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Feb 13 08:16:57.365229 ignition[1058]: DEBUG : files: createFilesystemsFiles: createFiles: op(6): file matches expected sum of: a3a2c02a90b008686c20babaf272e703924db2a3e2a0d4e2a7c81d994cbc68c47458a4a354ecc243af095b390815c7f203348b9749351ae817bd52a522300449 Feb 13 08:16:57.365229 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz" Feb 13 08:16:57.407685 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/opt/bin/kubectl" Feb 13 08:16:57.407685 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubectl: attempt #1 Feb 13 08:16:57.439564 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET result: OK Feb 13 08:16:57.553692 ignition[1058]: DEBUG : files: createFilesystemsFiles: createFiles: op(7): file matches expected sum of: 97840854134909d75a1a2563628cc4ba632067369ce7fc8a8a1e90a387d32dd7bfd73f4f5b5a82ef842088e7470692951eb7fc869c5f297dd740f855672ee628 Feb 13 08:16:57.579697 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/opt/bin/kubectl" Feb 13 08:16:57.579697 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/opt/bin/kubelet" Feb 13 08:16:57.579697 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubelet: attempt #1 Feb 13 08:16:57.628580 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET result: OK Feb 13 08:16:58.057961 ignition[1058]: DEBUG : files: createFilesystemsFiles: createFiles: op(8): file matches expected sum of: 40daf2a9b9e666c14b10e627da931bd79978628b1f23ef6429c1cb4fcba261f86ccff440c0dbb0070ee760fe55772b4fd279c4582dfbb17fa30bc94b7f00126b Feb 13 08:16:58.057961 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/opt/bin/kubelet" Feb 13 08:16:58.100682 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/opt/bin/kubeadm" Feb 13 08:16:58.100682 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(9): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubeadm: attempt #1 Feb 13 
08:16:58.132544 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(9): GET result: OK Feb 13 08:16:58.242363 ignition[1058]: DEBUG : files: createFilesystemsFiles: createFiles: op(9): file matches expected sum of: 1c324cd645a7bf93d19d24c87498d9a17878eb1cc927e2680200ffeab2f85051ddec47d85b79b8e774042dc6726299ad3d7caf52c060701f00deba30dc33f660 Feb 13 08:16:58.276580 kernel: BTRFS info: devid 1 device path /dev/sda6 changed to /dev/disk/by-label/OEM scanned by ignition (1062) Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/opt/bin/kubeadm" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/etc/docker/daemon.json" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/etc/docker/daemon.json" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/home/core/install.sh" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(d): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(d): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(e): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(e): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): [started] writing file "/sysroot/etc/systemd/system/packet-phone-home.service" Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): oem config not found in "/usr/share/oem", looking on oem partition Feb 13 08:16:58.276594 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(11): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3216674523" Feb 13 08:16:58.588697 kernel: audit: type=1130 audit(1707812218.502:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.493244 systemd[1]: Finished ignition-files.service. 
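Each "file matches expected sum of: ..." line above records Ignition hashing a downloaded artifact and comparing the result against the SHA-512 digest supplied in the config before the file is accepted under /sysroot. A minimal Go sketch of that verification step follows; it is not Ignition's own code, and the path and digest passed in main are placeholders shown only to illustrate the call.

    package main

    import (
        "crypto/sha512"
        "encoding/hex"
        "fmt"
        "os"
    )

    // verifySHA512 hashes the file at path and compares the hex digest against
    // the expected value, the same check the "file matches expected sum" lines report.
    func verifySHA512(path, expectedHex string) (bool, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            return false, err
        }
        sum := sha512.Sum512(data)
        return hex.EncodeToString(sum[:]) == expectedHex, nil
    }

    func main() {
        // Placeholder path and digest, only to illustrate the call shape.
        ok, err := verifySHA512("/sysroot/opt/bin/kubectl", "expected-hex-digest-goes-here")
        if err != nil {
            fmt.Println("read error:", err)
            return
        }
        fmt.Println("digest matches:", ok)
    }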
Feb 13 08:16:58.603639 ignition[1058]: CRITICAL : files: createFilesystemsFiles: createFiles: op(10): op(11): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3216674523": device or resource busy Feb 13 08:16:58.603639 ignition[1058]: ERROR : files: createFilesystemsFiles: createFiles: op(10): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem3216674523", trying btrfs: device or resource busy Feb 13 08:16:58.603639 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(12): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3216674523" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(12): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3216674523" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(13): [started] unmounting "/mnt/oem3216674523" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): op(13): [finished] unmounting "/mnt/oem3216674523" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: createFilesystemsFiles: createFiles: op(10): [finished] writing file "/sysroot/etc/systemd/system/packet-phone-home.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(14): [started] processing unit "coreos-metadata-sshkeys@.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(14): [finished] processing unit "coreos-metadata-sshkeys@.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(15): [started] processing unit "packet-phone-home.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(15): [finished] processing unit "packet-phone-home.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(16): [started] processing unit "containerd.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(16): op(17): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(16): op(17): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(16): [finished] processing unit "containerd.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(18): [started] processing unit "prepare-cni-plugins.service" Feb 13 08:16:58.603639 ignition[1058]: INFO : files: op(18): op(19): [started] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service" Feb 13 08:16:58.612000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.640000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:58.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.510229 systemd[1]: Starting initrd-setup-root-after-ignition.service... Feb 13 08:16:58.984000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(18): op(19): [finished] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(18): [finished] processing unit "prepare-cni-plugins.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1a): [started] processing unit "prepare-critools.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1a): op(1b): [started] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1a): op(1b): [finished] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1a): [finished] processing unit "prepare-critools.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1c): [started] processing unit "prepare-helm.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1c): op(1d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1c): op(1d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1c): [finished] processing unit "prepare-helm.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1e): [started] setting preset to enabled for "prepare-helm.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1e): [finished] setting preset to enabled for "prepare-helm.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1f): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service " Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(1f): [finished] setting preset to enabled for "coreos-metadata-sshkeys@.service " Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(20): [started] setting preset to enabled for "packet-phone-home.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(20): [finished] setting preset to enabled for "packet-phone-home.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(21): [started] setting preset to enabled for "prepare-cni-plugins.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(21): [finished] setting preset to enabled for "prepare-cni-plugins.service" Feb 13 08:16:58.995943 ignition[1058]: INFO : files: op(22): [started] setting preset to enabled for "prepare-critools.service" Feb 13 08:16:59.252000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 13 08:16:59.296000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.319000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.418136 initrd-setup-root-after-ignition[1091]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 08:16:58.571697 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile). Feb 13 08:16:59.463797 ignition[1058]: INFO : files: op(22): [finished] setting preset to enabled for "prepare-critools.service" Feb 13 08:16:59.463797 ignition[1058]: INFO : files: createResultFile: createFiles: op(23): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 08:16:59.463797 ignition[1058]: INFO : files: createResultFile: createFiles: op(23): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 08:16:59.463797 ignition[1058]: INFO : files: files passed Feb 13 08:16:59.463797 ignition[1058]: INFO : POST message to Packet Timeline Feb 13 08:16:59.463797 ignition[1058]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:16:59.463797 ignition[1058]: INFO : GET result: OK Feb 13 08:16:59.463797 ignition[1058]: INFO : Ignition finished successfully Feb 13 08:16:59.472000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.494000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.519000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.556000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.572007 systemd[1]: Starting ignition-quench.service... Feb 13 08:16:59.620000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.595749 systemd[1]: Finished initrd-setup-root-after-ignition.service. Feb 13 08:16:59.637000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.613859 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 08:16:58.613928 systemd[1]: Finished ignition-quench.service. Feb 13 08:16:59.744093 kernel: kauditd_printk_skb: 16 callbacks suppressed Feb 13 08:16:59.744111 kernel: audit: type=1131 audit(1707812219.662:57): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:16:59.662000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.641916 systemd[1]: Reached target ignition-complete.target. Feb 13 08:16:59.811562 kernel: audit: type=1131 audit(1707812219.752:58): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.752000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.672636 systemd[1]: Starting initrd-parse-etc.service... Feb 13 08:16:59.902804 kernel: audit: type=1131 audit(1707812219.817:59): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.902816 kernel: audit: type=1334 audit(1707812219.817:60): prog-id=6 op=UNLOAD Feb 13 08:16:59.817000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.817000 audit: BPF prog-id=6 op=UNLOAD Feb 13 08:16:59.902854 ignition[1106]: INFO : Ignition 2.14.0 Feb 13 08:16:59.902854 ignition[1106]: INFO : Stage: umount Feb 13 08:16:59.902854 ignition[1106]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:16:59.902854 ignition[1106]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:16:59.902854 ignition[1106]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:16:59.902854 ignition[1106]: INFO : umount: umount passed Feb 13 08:16:59.902854 ignition[1106]: INFO : POST message to Packet Timeline Feb 13 08:16:59.902854 ignition[1106]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:16:59.902854 ignition[1106]: INFO : GET result: OK Feb 13 08:16:59.902854 ignition[1106]: INFO : Ignition finished successfully Feb 13 08:17:00.343712 kernel: audit: type=1131 audit(1707812219.949:61): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.343729 kernel: audit: type=1131 audit(1707812220.015:62): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.343737 kernel: audit: type=1131 audit(1707812220.080:63): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.343744 kernel: audit: type=1130 audit(1707812220.146:64): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.343752 kernel: audit: type=1131 audit(1707812220.146:65): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 13 08:17:00.343760 kernel: audit: type=1131 audit(1707812220.270:66): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.949000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.015000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.080000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.146000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.270000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.328000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.719807 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 08:17:00.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.719896 systemd[1]: Finished initrd-parse-etc.service. Feb 13 08:17:00.374000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.734812 systemd[1]: Reached target initrd-fs.target. Feb 13 08:17:00.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.759812 systemd[1]: Reached target initrd.target. Feb 13 08:16:58.784959 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met. Feb 13 08:17:00.418000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.787135 systemd[1]: Starting dracut-pre-pivot.service... Feb 13 08:16:58.816788 systemd[1]: Finished dracut-pre-pivot.service. Feb 13 08:16:58.848883 systemd[1]: Starting initrd-cleanup.service... Feb 13 08:17:00.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.862670 systemd[1]: Stopped target network.target. 
Feb 13 08:17:00.485000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.884833 systemd[1]: Stopped target nss-lookup.target. Feb 13 08:17:00.501000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.918996 systemd[1]: Stopped target remote-cryptsetup.target. Feb 13 08:16:58.939118 systemd[1]: Stopped target timers.target. Feb 13 08:17:00.533000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.959057 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 08:17:00.548000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:00.548000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:58.959426 systemd[1]: Stopped dracut-pre-pivot.service. Feb 13 08:16:58.986375 systemd[1]: Stopped target initrd.target. Feb 13 08:16:59.003195 systemd[1]: Stopped target basic.target. Feb 13 08:16:59.029178 systemd[1]: Stopped target ignition-complete.target. Feb 13 08:16:59.051064 systemd[1]: Stopped target ignition-diskful.target. Feb 13 08:16:59.072068 systemd[1]: Stopped target initrd-root-device.target. Feb 13 08:17:00.601000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:16:59.097080 systemd[1]: Stopped target remote-fs.target. Feb 13 08:16:59.122061 systemd[1]: Stopped target remote-fs-pre.target. Feb 13 08:16:59.143096 systemd[1]: Stopped target sysinit.target. Feb 13 08:17:00.640000 audit: BPF prog-id=5 op=UNLOAD Feb 13 08:17:00.640000 audit: BPF prog-id=4 op=UNLOAD Feb 13 08:17:00.640000 audit: BPF prog-id=3 op=UNLOAD Feb 13 08:16:59.163082 systemd[1]: Stopped target local-fs.target. Feb 13 08:17:00.647000 audit: BPF prog-id=8 op=UNLOAD Feb 13 08:17:00.647000 audit: BPF prog-id=7 op=UNLOAD Feb 13 08:16:59.188044 systemd[1]: Stopped target local-fs-pre.target. Feb 13 08:16:59.213057 systemd[1]: Stopped target swap.target. Feb 13 08:16:59.231965 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 08:16:59.232335 systemd[1]: Stopped dracut-pre-mount.service. Feb 13 08:17:00.708525 systemd-journald[267]: Received SIGTERM from PID 1 (systemd). Feb 13 08:17:00.708572 iscsid[908]: iscsid shutting down. Feb 13 08:16:59.254252 systemd[1]: Stopped target cryptsetup.target. Feb 13 08:16:59.275945 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 08:16:59.276280 systemd[1]: Stopped dracut-initqueue.service. Feb 13 08:16:59.298222 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 08:16:59.298602 systemd[1]: Stopped ignition-fetch-offline.service. Feb 13 08:16:59.321235 systemd[1]: Stopped target paths.target. 
Feb 13 08:16:59.340956 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 08:16:59.345741 systemd[1]: Stopped systemd-ask-password-console.path. Feb 13 08:16:59.363032 systemd[1]: Stopped target slices.target. Feb 13 08:16:59.381994 systemd[1]: Stopped target sockets.target. Feb 13 08:16:59.405040 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 08:16:59.405287 systemd[1]: Closed iscsid.socket. Feb 13 08:16:59.425021 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 08:16:59.425248 systemd[1]: Closed iscsiuio.socket. Feb 13 08:16:59.446103 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 08:16:59.446460 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Feb 13 08:16:59.474166 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 08:16:59.474533 systemd[1]: Stopped ignition-files.service. Feb 13 08:16:59.496161 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Feb 13 08:16:59.496538 systemd[1]: Stopped flatcar-metadata-hostname.service. Feb 13 08:16:59.523185 systemd[1]: Stopping ignition-mount.service... Feb 13 08:16:59.542721 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 08:16:59.543088 systemd[1]: Stopped kmod-static-nodes.service. Feb 13 08:16:59.559801 systemd[1]: Stopping sysroot-boot.service... Feb 13 08:16:59.574190 systemd[1]: Stopping systemd-networkd.service... Feb 13 08:16:59.584608 systemd-networkd[878]: enp1s0f0np0: DHCPv6 lease lost Feb 13 08:16:59.590948 systemd[1]: Stopping systemd-resolved.service... Feb 13 08:16:59.591629 systemd-networkd[878]: enp1s0f1np1: DHCPv6 lease lost Feb 13 08:17:00.708000 audit: BPF prog-id=9 op=UNLOAD Feb 13 08:16:59.605598 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 08:16:59.605859 systemd[1]: Stopped systemd-udev-trigger.service. Feb 13 08:16:59.621925 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 08:16:59.622137 systemd[1]: Stopped dracut-pre-trigger.service. Feb 13 08:16:59.646154 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 08:16:59.648409 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 08:16:59.648636 systemd[1]: Stopped systemd-resolved.service. Feb 13 08:16:59.665529 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 08:16:59.665762 systemd[1]: Stopped systemd-networkd.service. Feb 13 08:16:59.753980 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 08:16:59.754019 systemd[1]: Stopped sysroot-boot.service. Feb 13 08:16:59.818888 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 08:16:59.818936 systemd[1]: Closed systemd-networkd.socket. Feb 13 08:16:59.912050 systemd[1]: Stopping network-cleanup.service... Feb 13 08:16:59.931624 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 08:16:59.931656 systemd[1]: Stopped parse-ip-for-networkd.service. Feb 13 08:16:59.950754 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 08:16:59.950808 systemd[1]: Stopped systemd-sysctl.service. Feb 13 08:17:00.016720 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 08:17:00.016743 systemd[1]: Stopped systemd-modules-load.service. Feb 13 08:17:00.082361 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 08:17:00.082675 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Feb 13 08:17:00.082716 systemd[1]: Finished initrd-cleanup.service. Feb 13 08:17:00.147750 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 08:17:00.147787 systemd[1]: Stopped ignition-mount.service. Feb 13 08:17:00.272426 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 08:17:00.272448 systemd[1]: Stopped ignition-disks.service. Feb 13 08:17:00.330396 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 08:17:00.330417 systemd[1]: Stopped ignition-kargs.service. Feb 13 08:17:00.357692 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 08:17:00.357714 systemd[1]: Stopped ignition-setup.service. Feb 13 08:17:00.375732 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 08:17:00.375775 systemd[1]: Stopped initrd-setup-root.service. Feb 13 08:17:00.390908 systemd[1]: Stopping systemd-udevd.service... Feb 13 08:17:00.404945 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 08:17:00.405078 systemd[1]: Stopped systemd-udevd.service. Feb 13 08:17:00.419947 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 08:17:00.420026 systemd[1]: Closed systemd-udevd-control.socket. Feb 13 08:17:00.439772 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 08:17:00.439858 systemd[1]: Closed systemd-udevd-kernel.socket. Feb 13 08:17:00.456697 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 08:17:00.456816 systemd[1]: Stopped dracut-pre-udev.service. Feb 13 08:17:00.471789 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 08:17:00.471905 systemd[1]: Stopped dracut-cmdline.service. Feb 13 08:17:00.486798 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 08:17:00.486914 systemd[1]: Stopped dracut-cmdline-ask.service. Feb 13 08:17:00.504377 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Feb 13 08:17:00.517630 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 08:17:00.517769 systemd[1]: Stopped systemd-vconsole-setup.service. Feb 13 08:17:00.535557 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 08:17:00.535758 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Feb 13 08:17:00.591504 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 08:17:00.591728 systemd[1]: Stopped network-cleanup.service. Feb 13 08:17:00.603086 systemd[1]: Reached target initrd-switch-root.target. Feb 13 08:17:00.620485 systemd[1]: Starting initrd-switch-root.service... Feb 13 08:17:00.638082 systemd[1]: Switching root. Feb 13 08:17:00.711599 systemd-journald[267]: Journal stopped Feb 13 08:17:04.282184 kernel: SELinux: Class mctp_socket not defined in policy. Feb 13 08:17:04.282198 kernel: SELinux: Class anon_inode not defined in policy. 
Feb 13 08:17:04.282207 kernel: SELinux: the above unknown classes and permissions will be allowed Feb 13 08:17:04.282212 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 08:17:04.282218 kernel: SELinux: policy capability open_perms=1 Feb 13 08:17:04.282223 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 08:17:04.282229 kernel: SELinux: policy capability always_check_network=0 Feb 13 08:17:04.282234 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 08:17:04.282239 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 08:17:04.282246 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 08:17:04.282251 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 08:17:04.282257 systemd[1]: Successfully loaded SELinux policy in 322.661ms. Feb 13 08:17:04.282264 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 5.856ms. Feb 13 08:17:04.282270 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 13 08:17:04.282278 systemd[1]: Detected architecture x86-64. Feb 13 08:17:04.282284 systemd[1]: Detected first boot. Feb 13 08:17:04.282290 systemd[1]: Hostname set to . Feb 13 08:17:04.282296 systemd[1]: Initializing machine ID from random generator. Feb 13 08:17:04.282302 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Feb 13 08:17:04.282307 systemd[1]: Populated /etc with preset unit settings. Feb 13 08:17:04.282313 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:17:04.282321 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:17:04.282328 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 08:17:04.282334 systemd[1]: Queued start job for default target multi-user.target. Feb 13 08:17:04.282340 systemd[1]: Created slice system-addon\x2dconfig.slice. Feb 13 08:17:04.282346 systemd[1]: Created slice system-addon\x2drun.slice. Feb 13 08:17:04.282353 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice. Feb 13 08:17:04.282360 systemd[1]: Created slice system-getty.slice. Feb 13 08:17:04.282366 systemd[1]: Created slice system-modprobe.slice. Feb 13 08:17:04.282372 systemd[1]: Created slice system-serial\x2dgetty.slice. Feb 13 08:17:04.282378 systemd[1]: Created slice system-system\x2dcloudinit.slice. Feb 13 08:17:04.282384 systemd[1]: Created slice system-systemd\x2dfsck.slice. Feb 13 08:17:04.282390 systemd[1]: Created slice user.slice. Feb 13 08:17:04.282396 systemd[1]: Started systemd-ask-password-console.path. Feb 13 08:17:04.282402 systemd[1]: Started systemd-ask-password-wall.path. Feb 13 08:17:04.282408 systemd[1]: Set up automount boot.automount. Feb 13 08:17:04.282415 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Feb 13 08:17:04.282421 systemd[1]: Reached target integritysetup.target. 
Feb 13 08:17:04.282427 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 08:17:04.282433 systemd[1]: Reached target remote-fs.target. Feb 13 08:17:04.282441 systemd[1]: Reached target slices.target. Feb 13 08:17:04.282447 systemd[1]: Reached target swap.target. Feb 13 08:17:04.282453 systemd[1]: Reached target torcx.target. Feb 13 08:17:04.282459 systemd[1]: Reached target veritysetup.target. Feb 13 08:17:04.282469 systemd[1]: Listening on systemd-coredump.socket. Feb 13 08:17:04.282476 systemd[1]: Listening on systemd-initctl.socket. Feb 13 08:17:04.282483 systemd[1]: Listening on systemd-journald-audit.socket. Feb 13 08:17:04.282515 systemd[1]: Listening on systemd-journald-dev-log.socket. Feb 13 08:17:04.282538 systemd[1]: Listening on systemd-journald.socket. Feb 13 08:17:04.282544 systemd[1]: Listening on systemd-networkd.socket. Feb 13 08:17:04.282566 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 08:17:04.282573 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 08:17:04.282580 systemd[1]: Listening on systemd-userdbd.socket. Feb 13 08:17:04.282587 systemd[1]: Mounting dev-hugepages.mount... Feb 13 08:17:04.282593 systemd[1]: Mounting dev-mqueue.mount... Feb 13 08:17:04.282600 systemd[1]: Mounting media.mount... Feb 13 08:17:04.282606 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 08:17:04.282613 systemd[1]: Mounting sys-kernel-debug.mount... Feb 13 08:17:04.282620 systemd[1]: Mounting sys-kernel-tracing.mount... Feb 13 08:17:04.282626 systemd[1]: Mounting tmp.mount... Feb 13 08:17:04.282633 systemd[1]: Starting flatcar-tmpfiles.service... Feb 13 08:17:04.282639 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Feb 13 08:17:04.282645 systemd[1]: Starting kmod-static-nodes.service... Feb 13 08:17:04.282652 systemd[1]: Starting modprobe@configfs.service... Feb 13 08:17:04.282658 systemd[1]: Starting modprobe@dm_mod.service... Feb 13 08:17:04.282664 systemd[1]: Starting modprobe@drm.service... Feb 13 08:17:04.282671 systemd[1]: Starting modprobe@efi_pstore.service... Feb 13 08:17:04.282678 systemd[1]: Starting modprobe@fuse.service... Feb 13 08:17:04.282685 kernel: fuse: init (API version 7.34) Feb 13 08:17:04.282691 systemd[1]: Starting modprobe@loop.service... Feb 13 08:17:04.282697 kernel: loop: module loaded Feb 13 08:17:04.282703 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 08:17:04.282710 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. Feb 13 08:17:04.282716 systemd[1]: (This warning is only shown for the first unit using IP firewalling.) Feb 13 08:17:04.282722 systemd[1]: Starting systemd-journald.service... Feb 13 08:17:04.282730 systemd[1]: Starting systemd-modules-load.service... Feb 13 08:17:04.282739 systemd-journald[1302]: Journal started Feb 13 08:17:04.282763 systemd-journald[1302]: Runtime Journal (/run/log/journal/a910983438cd4c499390a8be912474dd) is 8.0M, max 640.1M, 632.1M free. 
Feb 13 08:17:03.666000 audit[1]: AVC avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 13 08:17:03.666000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Feb 13 08:17:04.278000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Feb 13 08:17:04.278000 audit[1302]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffcc78106d0 a2=4000 a3=7ffcc781076c items=0 ppid=1 pid=1302 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:04.278000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Feb 13 08:17:04.315645 systemd[1]: Starting systemd-network-generator.service... Feb 13 08:17:04.337654 systemd[1]: Starting systemd-remount-fs.service... Feb 13 08:17:04.358522 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 08:17:04.393518 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 08:17:04.408658 systemd[1]: Started systemd-journald.service. Feb 13 08:17:04.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.418213 systemd[1]: Mounted dev-hugepages.mount. Feb 13 08:17:04.425724 systemd[1]: Mounted dev-mqueue.mount. Feb 13 08:17:04.433717 systemd[1]: Mounted media.mount. Feb 13 08:17:04.440707 systemd[1]: Mounted sys-kernel-debug.mount. Feb 13 08:17:04.448690 systemd[1]: Mounted sys-kernel-tracing.mount. Feb 13 08:17:04.456674 systemd[1]: Mounted tmp.mount. Feb 13 08:17:04.463829 systemd[1]: Finished flatcar-tmpfiles.service. Feb 13 08:17:04.471000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.472930 systemd[1]: Finished kmod-static-nodes.service. Feb 13 08:17:04.480000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.482065 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 08:17:04.482239 systemd[1]: Finished modprobe@configfs.service. Feb 13 08:17:04.489000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.489000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.490989 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 08:17:04.491200 systemd[1]: Finished modprobe@dm_mod.service. 
Feb 13 08:17:04.498000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.498000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.500072 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 08:17:04.500325 systemd[1]: Finished modprobe@drm.service. Feb 13 08:17:04.507000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.507000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.509333 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 08:17:04.509725 systemd[1]: Finished modprobe@efi_pstore.service. Feb 13 08:17:04.516000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.516000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.518311 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 08:17:04.518800 systemd[1]: Finished modprobe@fuse.service. Feb 13 08:17:04.526000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.526000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.528365 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 08:17:04.528753 systemd[1]: Finished modprobe@loop.service. Feb 13 08:17:04.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.535000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.537422 systemd[1]: Finished systemd-modules-load.service. Feb 13 08:17:04.544000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.546512 systemd[1]: Finished systemd-network-generator.service. 
Feb 13 08:17:04.553000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.555407 systemd[1]: Finished systemd-remount-fs.service. Feb 13 08:17:04.562000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.564408 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 08:17:04.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.573650 systemd[1]: Reached target network-pre.target. Feb 13 08:17:04.584227 systemd[1]: Mounting sys-fs-fuse-connections.mount... Feb 13 08:17:04.595194 systemd[1]: Mounting sys-kernel-config.mount... Feb 13 08:17:04.602712 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 08:17:04.606261 systemd[1]: Starting systemd-hwdb-update.service... Feb 13 08:17:04.616284 systemd[1]: Starting systemd-journal-flush.service... Feb 13 08:17:04.624751 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 08:17:04.627366 systemd[1]: Starting systemd-random-seed.service... Feb 13 08:17:04.629390 systemd-journald[1302]: Time spent on flushing to /var/log/journal/a910983438cd4c499390a8be912474dd is 14.322ms for 1540 entries. Feb 13 08:17:04.629390 systemd-journald[1302]: System Journal (/var/log/journal/a910983438cd4c499390a8be912474dd) is 8.0M, max 195.6M, 187.6M free. Feb 13 08:17:04.671048 systemd-journald[1302]: Received client request to flush runtime journal. Feb 13 08:17:04.642620 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Feb 13 08:17:04.643139 systemd[1]: Starting systemd-sysctl.service... Feb 13 08:17:04.654074 systemd[1]: Starting systemd-sysusers.service... Feb 13 08:17:04.661202 systemd[1]: Starting systemd-udev-settle.service... Feb 13 08:17:04.668685 systemd[1]: Mounted sys-fs-fuse-connections.mount. Feb 13 08:17:04.677712 systemd[1]: Mounted sys-kernel-config.mount. Feb 13 08:17:04.686772 systemd[1]: Finished systemd-journal-flush.service. Feb 13 08:17:04.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.694758 systemd[1]: Finished systemd-random-seed.service. Feb 13 08:17:04.708830 kernel: kauditd_printk_skb: 54 callbacks suppressed Feb 13 08:17:04.708892 kernel: audit: type=1130 audit(1707812224.693:112): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.758756 systemd[1]: Finished systemd-sysctl.service. 
Feb 13 08:17:04.801471 kernel: audit: type=1130 audit(1707812224.757:113): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.808000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.809744 systemd[1]: Finished systemd-sysusers.service. Feb 13 08:17:04.854527 kernel: audit: type=1130 audit(1707812224.808:114): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.862708 systemd[1]: Reached target first-boot-complete.target. Feb 13 08:17:04.907593 kernel: audit: type=1130 audit(1707812224.860:115): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.916491 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 13 08:17:04.924918 udevadm[1328]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Feb 13 08:17:04.935598 systemd[1]: Finished systemd-hwdb-update.service. Feb 13 08:17:04.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.944968 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Feb 13 08:17:04.992661 kernel: audit: type=1130 audit(1707812224.942:116): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:04.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:05.001357 systemd[1]: Starting systemd-udevd.service... Feb 13 08:17:05.064524 kernel: audit: type=1130 audit(1707812224.999:117): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:05.074532 systemd-udevd[1337]: Using default interface naming scheme 'v252'. Feb 13 08:17:05.093936 systemd[1]: Started systemd-udevd.service. Feb 13 08:17:05.100000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:05.104874 systemd[1]: Found device dev-ttyS1.device. 
Feb 13 08:17:05.152477 kernel: audit: type=1130 audit(1707812225.100:118): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:05.152538 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Feb 13 08:17:05.202443 kernel: BTRFS info: devid 1 device path /dev/disk/by-label/OEM changed to /dev/sda6 scanned by (udev-worker) (1385) Feb 13 08:17:05.202487 kernel: ACPI: button: Sleep Button [SLPB] Feb 13 08:17:05.207350 systemd[1]: Starting systemd-networkd.service... Feb 13 08:17:05.238303 systemd[1]: dev-disk-by\x2dlabel-OEM.device was skipped because of an unmet condition check (ConditionPathExists=!/usr/.noupdate). Feb 13 08:17:05.240712 systemd[1]: Starting systemd-userdbd.service... Feb 13 08:17:05.245644 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Feb 13 08:17:05.246474 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 08:17:05.268612 kernel: ACPI: button: Power Button [PWRF] Feb 13 08:17:05.308480 kernel: IPMI message handler: version 39.2 Feb 13 08:17:05.328478 kernel: ipmi device interface Feb 13 08:17:05.157000 audit[1390]: AVC avc: denied { confidentiality } for pid=1390 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 08:17:05.405478 kernel: audit: type=1400 audit(1707812225.157:119): avc: denied { confidentiality } for pid=1390 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 08:17:05.157000 audit[1390]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=7fa74c539010 a1=df92c a2=7fa74e29fbc5 a3=5 items=312 ppid=1337 pid=1390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:05.157000 audit: CWD cwd="/" Feb 13 08:17:05.524465 kernel: audit: type=1300 audit(1707812225.157:119): arch=c000003e syscall=175 success=yes exit=0 a0=7fa74c539010 a1=df92c a2=7fa74e29fbc5 a3=5 items=312 ppid=1337 pid=1390 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:05.524516 kernel: audit: type=1307 audit(1707812225.157:119): cwd="/" Feb 13 08:17:05.157000 audit: PATH item=0 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=1 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=2 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=3 name=(null) inode=24702 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=4 name=(null) inode=24701 
dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=5 name=(null) inode=24703 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=6 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=7 name=(null) inode=24704 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=8 name=(null) inode=24704 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=9 name=(null) inode=24705 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=10 name=(null) inode=24704 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=11 name=(null) inode=24706 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=12 name=(null) inode=24704 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=13 name=(null) inode=24707 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=14 name=(null) inode=24704 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=15 name=(null) inode=24708 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=16 name=(null) inode=24704 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=17 name=(null) inode=24709 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=18 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=19 name=(null) inode=24710 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=20 name=(null) inode=24710 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 
nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=21 name=(null) inode=24711 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=22 name=(null) inode=24710 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=23 name=(null) inode=24712 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=24 name=(null) inode=24710 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=25 name=(null) inode=24713 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=26 name=(null) inode=24710 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=27 name=(null) inode=24714 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=28 name=(null) inode=24710 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=29 name=(null) inode=24715 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=30 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=31 name=(null) inode=24716 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=32 name=(null) inode=24716 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=33 name=(null) inode=24717 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=34 name=(null) inode=24716 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=35 name=(null) inode=24718 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=36 name=(null) inode=24716 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 
08:17:05.157000 audit: PATH item=37 name=(null) inode=24719 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=38 name=(null) inode=24716 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=39 name=(null) inode=24720 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=40 name=(null) inode=24716 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=41 name=(null) inode=24721 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=42 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=43 name=(null) inode=24722 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=44 name=(null) inode=24722 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=45 name=(null) inode=24723 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=46 name=(null) inode=24722 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=47 name=(null) inode=24724 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=48 name=(null) inode=24722 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=49 name=(null) inode=24725 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=50 name=(null) inode=24722 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=51 name=(null) inode=24726 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=52 name=(null) inode=24722 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=53 name=(null) inode=24727 dev=00:0b 
mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=54 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=55 name=(null) inode=24728 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=56 name=(null) inode=24728 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=57 name=(null) inode=24729 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=58 name=(null) inode=24728 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=59 name=(null) inode=24730 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=60 name=(null) inode=24728 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=61 name=(null) inode=24731 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=62 name=(null) inode=24728 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=63 name=(null) inode=24732 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=64 name=(null) inode=24728 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=65 name=(null) inode=24733 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=66 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=67 name=(null) inode=24734 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=68 name=(null) inode=24734 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=69 name=(null) inode=24735 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 
nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=70 name=(null) inode=24734 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=71 name=(null) inode=24736 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=72 name=(null) inode=24734 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=73 name=(null) inode=24737 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=74 name=(null) inode=24734 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=75 name=(null) inode=24738 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=76 name=(null) inode=24734 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=77 name=(null) inode=24739 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=78 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=79 name=(null) inode=24740 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=80 name=(null) inode=24740 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=81 name=(null) inode=24741 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=82 name=(null) inode=24740 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=83 name=(null) inode=24742 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=84 name=(null) inode=24740 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=85 name=(null) inode=24743 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 
08:17:05.157000 audit: PATH item=86 name=(null) inode=24740 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=87 name=(null) inode=24744 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=88 name=(null) inode=24740 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=89 name=(null) inode=24745 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=90 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=91 name=(null) inode=24746 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=92 name=(null) inode=24746 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=93 name=(null) inode=24747 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=94 name=(null) inode=24746 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=95 name=(null) inode=24748 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=96 name=(null) inode=24746 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=97 name=(null) inode=24749 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=98 name=(null) inode=24746 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=99 name=(null) inode=24750 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=100 name=(null) inode=24746 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=101 name=(null) inode=24751 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=102 name=(null) inode=24701 dev=00:0b 
mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=103 name=(null) inode=24752 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=104 name=(null) inode=24752 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=105 name=(null) inode=24753 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=106 name=(null) inode=24752 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=107 name=(null) inode=24754 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=108 name=(null) inode=24752 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=109 name=(null) inode=24755 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=110 name=(null) inode=24752 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=111 name=(null) inode=24756 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=112 name=(null) inode=24752 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=113 name=(null) inode=24757 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=114 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=115 name=(null) inode=24758 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=116 name=(null) inode=24758 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=117 name=(null) inode=24759 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=118 name=(null) inode=24758 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=119 name=(null) inode=24760 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=120 name=(null) inode=24758 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=121 name=(null) inode=24761 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=122 name=(null) inode=24758 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=123 name=(null) inode=24762 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=124 name=(null) inode=24758 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=125 name=(null) inode=24763 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=126 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=127 name=(null) inode=24764 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=128 name=(null) inode=24764 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=129 name=(null) inode=24765 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=130 name=(null) inode=24764 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=131 name=(null) inode=24766 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=132 name=(null) inode=24764 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=133 name=(null) inode=24767 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=134 name=(null) inode=24764 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 
cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=135 name=(null) inode=24768 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=136 name=(null) inode=24764 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=137 name=(null) inode=24769 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=138 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=139 name=(null) inode=24770 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=140 name=(null) inode=24770 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=141 name=(null) inode=24771 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=142 name=(null) inode=24770 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=143 name=(null) inode=24772 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=144 name=(null) inode=24770 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=145 name=(null) inode=24773 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=146 name=(null) inode=24770 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=147 name=(null) inode=24774 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=148 name=(null) inode=24770 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=149 name=(null) inode=24775 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=150 name=(null) inode=24701 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 
08:17:05.157000 audit: PATH item=151 name=(null) inode=24776 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=152 name=(null) inode=24776 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=153 name=(null) inode=24777 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=154 name=(null) inode=24776 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=155 name=(null) inode=24778 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=156 name=(null) inode=24776 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=157 name=(null) inode=24779 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=158 name=(null) inode=24776 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=159 name=(null) inode=24780 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=160 name=(null) inode=24776 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=161 name=(null) inode=24781 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=162 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=163 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=164 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=165 name=(null) inode=24783 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=166 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=167 name=(null) inode=24784 
dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=168 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=169 name=(null) inode=24785 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=170 name=(null) inode=24785 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=171 name=(null) inode=24786 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=172 name=(null) inode=24785 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=173 name=(null) inode=24787 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=174 name=(null) inode=24785 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=175 name=(null) inode=24788 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=176 name=(null) inode=24785 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=177 name=(null) inode=24789 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=178 name=(null) inode=24785 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=179 name=(null) inode=24790 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=180 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=181 name=(null) inode=24791 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=182 name=(null) inode=24791 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=183 name=(null) inode=24792 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=184 name=(null) inode=24791 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=185 name=(null) inode=24793 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=186 name=(null) inode=24791 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=187 name=(null) inode=24794 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=188 name=(null) inode=24791 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=189 name=(null) inode=24795 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=190 name=(null) inode=24791 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=191 name=(null) inode=24796 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=192 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=193 name=(null) inode=24797 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=194 name=(null) inode=24797 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=195 name=(null) inode=24798 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=196 name=(null) inode=24797 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=197 name=(null) inode=24799 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=198 name=(null) inode=24797 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=199 name=(null) inode=24800 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 
cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=200 name=(null) inode=24797 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=201 name=(null) inode=24801 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=202 name=(null) inode=24797 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=203 name=(null) inode=24802 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=204 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=205 name=(null) inode=24803 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=206 name=(null) inode=24803 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=207 name=(null) inode=24804 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=208 name=(null) inode=24803 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=209 name=(null) inode=24805 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=210 name=(null) inode=24803 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=211 name=(null) inode=24806 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=212 name=(null) inode=24803 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=213 name=(null) inode=24807 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=214 name=(null) inode=24803 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=215 name=(null) inode=24808 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 
08:17:05.157000 audit: PATH item=216 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=217 name=(null) inode=24809 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=218 name=(null) inode=24809 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=219 name=(null) inode=24810 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=220 name=(null) inode=24809 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=221 name=(null) inode=24811 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=222 name=(null) inode=24809 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=223 name=(null) inode=24812 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=224 name=(null) inode=24809 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=225 name=(null) inode=24813 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=226 name=(null) inode=24809 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=227 name=(null) inode=24814 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=228 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=229 name=(null) inode=24815 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=230 name=(null) inode=24815 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=231 name=(null) inode=24816 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=232 name=(null) inode=24815 
dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=233 name=(null) inode=24817 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=234 name=(null) inode=24815 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=235 name=(null) inode=24818 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=236 name=(null) inode=24815 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=237 name=(null) inode=24819 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=238 name=(null) inode=24815 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=239 name=(null) inode=24820 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=240 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=241 name=(null) inode=24821 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=242 name=(null) inode=24821 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=243 name=(null) inode=24822 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=244 name=(null) inode=24821 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=245 name=(null) inode=24823 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=246 name=(null) inode=24821 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=247 name=(null) inode=24824 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=248 name=(null) inode=24821 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=249 name=(null) inode=24825 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=250 name=(null) inode=24821 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=251 name=(null) inode=24826 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=252 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=253 name=(null) inode=24827 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=254 name=(null) inode=24827 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=255 name=(null) inode=24828 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=256 name=(null) inode=24827 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=257 name=(null) inode=24829 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=258 name=(null) inode=24827 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=259 name=(null) inode=24830 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=260 name=(null) inode=24827 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=261 name=(null) inode=24831 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=262 name=(null) inode=24827 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=263 name=(null) inode=24832 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=264 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 
cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=265 name=(null) inode=24833 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=266 name=(null) inode=24833 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=267 name=(null) inode=24834 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=268 name=(null) inode=24833 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=269 name=(null) inode=24835 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=270 name=(null) inode=24833 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=271 name=(null) inode=24836 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=272 name=(null) inode=24833 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=273 name=(null) inode=24837 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=274 name=(null) inode=24833 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=275 name=(null) inode=24838 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=276 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=277 name=(null) inode=24839 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=278 name=(null) inode=24839 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=279 name=(null) inode=24840 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=280 name=(null) inode=24839 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 
08:17:05.157000 audit: PATH item=281 name=(null) inode=24841 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=282 name=(null) inode=24839 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=283 name=(null) inode=24842 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=284 name=(null) inode=24839 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=285 name=(null) inode=24843 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=286 name=(null) inode=24839 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=287 name=(null) inode=24844 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=288 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=289 name=(null) inode=24845 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=290 name=(null) inode=24845 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=291 name=(null) inode=24846 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=292 name=(null) inode=24845 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=293 name=(null) inode=24847 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=294 name=(null) inode=24845 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=295 name=(null) inode=24848 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=296 name=(null) inode=24845 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=297 name=(null) 
inode=24849 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=298 name=(null) inode=24845 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=299 name=(null) inode=24850 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=300 name=(null) inode=24782 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=301 name=(null) inode=24851 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=302 name=(null) inode=24851 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=303 name=(null) inode=24852 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=304 name=(null) inode=24851 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=305 name=(null) inode=24853 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=306 name=(null) inode=24851 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=307 name=(null) inode=24854 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=308 name=(null) inode=24851 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=309 name=(null) inode=24855 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=310 name=(null) inode=24851 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PATH item=311 name=(null) inode=24856 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:17:05.157000 audit: PROCTITLE proctitle="(udev-worker)" Feb 13 08:17:05.529461 systemd[1]: Started systemd-userdbd.service. 
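The long run of PATH records above belongs to a single audit event raised while (udev-worker) created tracefs entries: each record is a flat list of key=value fields (name, inode, dev, mode, ouid/ogid, the SELinux obj label, nametype, and the cap_* fields), alternating between the PARENT directory and the CREATEd entry. A minimal sketch of pulling one such record apart, assuming plain space-separated key=value tokens as printed above (parse_audit_path is an illustrative helper, not part of any audit tooling):

# Illustrative only: split one audit PATH record into a dict of its key=value fields.
def parse_audit_path(record: str) -> dict:
    fields = {}
    for token in record.split():
        key, sep, value = token.partition("=")
        if sep:                      # keep only key=value tokens
            fields[key] = value
    return fields

# Example values taken from item=21 above.
sample = "item=21 name=(null) inode=24711 dev=00:0b mode=0100640 nametype=CREATE"
rec = parse_audit_path(sample)
print(rec["inode"], rec["mode"], rec["nametype"])   # -> 24711 0100640 CREATE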
Feb 13 08:17:05.573252 kernel: ipmi_si: IPMI System Interface driver Feb 13 08:17:05.573280 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Feb 13 08:17:05.573395 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Feb 13 08:17:05.573480 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Feb 13 08:17:05.619312 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Feb 13 08:17:05.665288 kernel: i2c i2c-0: 2/4 memory slots populated (from DMI) Feb 13 08:17:05.665378 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Feb 13 08:17:05.709689 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Feb 13 08:17:05.709769 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Feb 13 08:17:05.724000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:05.734599 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Feb 13 08:17:05.734740 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Feb 13 08:17:05.807483 kernel: iTCO_vendor_support: vendor-support=0 Feb 13 08:17:05.807536 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Feb 13 08:17:05.854388 kernel: ipmi_si: Adding ACPI-specified kcs state machine Feb 13 08:17:05.880583 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Feb 13 08:17:05.912587 systemd-networkd[1419]: bond0: netdev ready Feb 13 08:17:05.915107 systemd-networkd[1419]: lo: Link UP Feb 13 08:17:05.915110 systemd-networkd[1419]: lo: Gained carrier Feb 13 08:17:05.915653 systemd-networkd[1419]: Enumeration completed Feb 13 08:17:05.915748 systemd[1]: Started systemd-networkd.service. Feb 13 08:17:05.915969 systemd-networkd[1419]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Feb 13 08:17:05.921270 systemd-networkd[1419]: enp1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:97:f6:b1.network. Feb 13 08:17:05.956254 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Feb 13 08:17:05.956360 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Feb 13 08:17:05.956447 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Feb 13 08:17:05.979472 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Feb 13 08:17:06.016000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:06.054436 kernel: intel_rapl_common: Found RAPL domain package Feb 13 08:17:06.054470 kernel: intel_rapl_common: Found RAPL domain core Feb 13 08:17:06.074722 kernel: intel_rapl_common: Found RAPL domain dram Feb 13 08:17:06.212474 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Feb 13 08:17:06.234473 kernel: ipmi_ssif: IPMI SSIF Interface driver Feb 13 08:17:06.234896 systemd[1]: Finished systemd-udev-settle.service. Feb 13 08:17:06.242000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:06.244342 systemd[1]: Starting lvm2-activation-early.service... Feb 13 08:17:06.260199 lvm[1444]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 08:17:06.294984 systemd[1]: Finished lvm2-activation-early.service. Feb 13 08:17:06.302000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:06.303670 systemd[1]: Reached target cryptsetup.target. Feb 13 08:17:06.313180 systemd[1]: Starting lvm2-activation.service... Feb 13 08:17:06.315434 lvm[1446]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 08:17:06.354921 systemd[1]: Finished lvm2-activation.service. Feb 13 08:17:06.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:06.363655 systemd[1]: Reached target local-fs-pre.target. Feb 13 08:17:06.371521 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 08:17:06.371536 systemd[1]: Reached target local-fs.target. Feb 13 08:17:06.379553 systemd[1]: Reached target machines.target. Feb 13 08:17:06.389146 systemd[1]: Starting ldconfig.service... Feb 13 08:17:06.395844 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Feb 13 08:17:06.395865 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 08:17:06.396418 systemd[1]: Starting systemd-boot-update.service... Feb 13 08:17:06.404004 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Feb 13 08:17:06.414102 systemd[1]: Starting systemd-machine-id-commit.service... Feb 13 08:17:06.414273 systemd[1]: systemd-sysext.service was skipped because no trigger condition checks were met. Feb 13 08:17:06.414309 systemd[1]: ensure-sysext.service was skipped because no trigger condition checks were met. Feb 13 08:17:06.414892 systemd[1]: Starting systemd-tmpfiles-setup.service... Feb 13 08:17:06.415156 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1449 (bootctl) Feb 13 08:17:06.415821 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Feb 13 08:17:06.423515 systemd-tmpfiles[1453]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Feb 13 08:17:06.435268 systemd-tmpfiles[1453]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 08:17:06.435916 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Feb 13 08:17:06.434000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:06.442204 systemd-tmpfiles[1453]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. 
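The systemd-tmpfiles warnings above ("Duplicate line for path …, ignoring") mean that more than one tmpfiles.d fragment declares the same path and only the first declaration is kept. A rough Python sketch of how such duplicates could be located; the directory scanned and the simplified parsing are assumptions for illustration, not taken from this host:

    import glob
    from collections import OrderedDict

    # Illustrative only: scan tmpfiles.d fragments and report duplicate path
    # entries, mirroring the "Duplicate line for path ..., ignoring" warning.
    first_seen = OrderedDict()
    for conf in sorted(glob.glob('/usr/lib/tmpfiles.d/*.conf')):
        with open(conf) as f:
            for lineno, line in enumerate(f, 1):
                line = line.strip()
                if not line or line.startswith('#'):
                    continue
                parts = line.split()
                if len(parts) < 2:
                    continue
                path = parts[1]
                if path in first_seen:
                    print(f'{conf}:{lineno}: duplicate line for path "{path}", '
                          f'first declared in {first_seen[path]}')
                else:
                    first_seen[path] = f'{conf}:{lineno}'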
Feb 13 08:17:06.484486 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 08:17:06.515488 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Feb 13 08:17:06.516644 systemd-networkd[1419]: enp1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:97:f6:b0.network. Feb 13 08:17:06.581500 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:17:06.699517 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 08:17:06.699716 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:17:06.722520 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Feb 13 08:17:06.743502 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): bond0: link becomes ready Feb 13 08:17:06.764460 systemd-networkd[1419]: bond0: Link UP Feb 13 08:17:06.764860 systemd-networkd[1419]: enp1s0f1np1: Link UP Feb 13 08:17:06.765040 systemd-networkd[1419]: enp1s0f1np1: Gained carrier Feb 13 08:17:06.766115 systemd-networkd[1419]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:97:f6:b0.network. Feb 13 08:17:06.809472 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.830471 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.851489 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.873503 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.895507 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.916502 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.937509 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.958514 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.979513 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:06.998472 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.018471 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.036473 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.037079 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 08:17:07.037434 systemd[1]: Finished systemd-machine-id-commit.service. Feb 13 08:17:07.035000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:07.055912 systemd-fsck[1459]: fsck.fat 4.2 (2021-01-31) Feb 13 08:17:07.055912 systemd-fsck[1459]: /dev/sda1: 789 files, 115339/258078 clusters Feb 13 08:17:07.056472 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.056542 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Feb 13 08:17:07.073000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:07.075471 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.076343 systemd[1]: Mounting boot.mount... Feb 13 08:17:07.089636 systemd[1]: Mounted boot.mount. 
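The systemd-fsck result above reports the EFI system partition (/dev/sda1) as "789 files, 115339/258078 clusters". A one-line check of how full that FAT filesystem is, using only the figures from the log (the cluster size itself is not reported, so only a ratio can be derived):

    # Figures taken from the fsck.fat output above.
    used_clusters, total_clusters = 115339, 258078
    print(f'{used_clusters / total_clusters:.1%} of clusters in use')
    # -> 44.7% of clusters in use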
Feb 13 08:17:07.093516 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.112516 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.119668 systemd[1]: Finished systemd-boot-update.service. Feb 13 08:17:07.130474 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.143000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:07.149472 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.149858 systemd-networkd[1419]: enp1s0f0np0: Link UP Feb 13 08:17:07.150024 systemd-networkd[1419]: bond0: Gained carrier Feb 13 08:17:07.150136 systemd-networkd[1419]: enp1s0f0np0: Gained carrier Feb 13 08:17:07.152494 systemd[1]: Finished systemd-tmpfiles-setup.service. Feb 13 08:17:07.166472 kernel: bond0: (slave enp1s0f1np1): link status down again after 200 ms Feb 13 08:17:07.166497 kernel: bond0: (slave enp1s0f1np1): link status definitely down, disabling slave Feb 13 08:17:07.166511 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:17:07.205000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:07.207329 systemd[1]: Starting audit-rules.service... Feb 13 08:17:07.214472 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex Feb 13 08:17:07.214499 kernel: bond0: active interface up! Feb 13 08:17:07.214505 systemd-networkd[1419]: enp1s0f1np1: Link DOWN Feb 13 08:17:07.214507 systemd-networkd[1419]: enp1s0f1np1: Lost carrier Feb 13 08:17:07.222000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Feb 13 08:17:07.222000 audit[1483]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffbc0a9150 a2=420 a3=0 items=0 ppid=1468 pid=1483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:07.222000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Feb 13 08:17:07.224566 augenrules[1483]: No rules Feb 13 08:17:07.233296 systemd[1]: Starting clean-ca-certificates.service... Feb 13 08:17:07.242206 systemd[1]: Starting systemd-journal-catalog-update.service... Feb 13 08:17:07.242776 ldconfig[1448]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 08:17:07.252394 systemd[1]: Starting systemd-resolved.service... Feb 13 08:17:07.260329 systemd[1]: Starting systemd-timesyncd.service... Feb 13 08:17:07.268122 systemd[1]: Starting systemd-update-utmp.service... Feb 13 08:17:07.274889 systemd[1]: Finished ldconfig.service. Feb 13 08:17:07.281721 systemd[1]: Finished audit-rules.service. Feb 13 08:17:07.288703 systemd[1]: Finished clean-ca-certificates.service. Feb 13 08:17:07.296723 systemd[1]: Finished systemd-journal-catalog-update.service. Feb 13 08:17:07.308370 systemd[1]: Starting systemd-update-done.service... 
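The `PROCTITLE proctitle=2F73…` fields in the audit records are the audited process's command line, hex-encoded with NUL bytes separating the arguments; the value in the auditctl record above decodes to `/sbin/auditctl -R /etc/audit/audit.rules`. A short Python sketch of the decoding:

    # Decode an audit PROCTITLE value: a hex string with NUL-separated argv.
    proctitle = '2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573'
    argv = bytes.fromhex(proctitle).split(b'\x00')
    print(' '.join(arg.decode() for arg in argv))
    # -> /sbin/auditctl -R /etc/audit/audit.rules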
Feb 13 08:17:07.315550 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 08:17:07.316032 systemd[1]: Finished systemd-update-done.service. Feb 13 08:17:07.325302 systemd[1]: Finished systemd-update-utmp.service. Feb 13 08:17:07.340993 systemd[1]: Started systemd-timesyncd.service. Feb 13 08:17:07.342756 systemd-resolved[1493]: Positive Trust Anchors: Feb 13 08:17:07.342764 systemd-resolved[1493]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 08:17:07.342792 systemd-resolved[1493]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 08:17:07.346824 systemd-resolved[1493]: Using system hostname 'ci-3510.3.2-a-56b02fc11a'. Feb 13 08:17:07.349737 systemd[1]: Reached target time-set.target. Feb 13 08:17:07.369526 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 08:17:07.373666 systemd-networkd[1419]: enp1s0f1np1: Link UP Feb 13 08:17:07.373830 systemd-networkd[1419]: enp1s0f1np1: Gained carrier Feb 13 08:17:07.374659 systemd[1]: Started systemd-resolved.service. Feb 13 08:17:07.382584 systemd[1]: Reached target network.target. Feb 13 08:17:07.390559 systemd[1]: Reached target nss-lookup.target. Feb 13 08:17:07.398568 systemd[1]: Reached target sysinit.target. Feb 13 08:17:07.406613 systemd[1]: Started motdgen.path. Feb 13 08:17:07.413579 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Feb 13 08:17:07.429083 systemd[1]: Started logrotate.timer. Feb 13 08:17:07.437512 kernel: bond0: (slave enp1s0f1np1): link status up, enabling it in 200 ms Feb 13 08:17:07.437540 kernel: bond0: (slave enp1s0f1np1): invalid new link 3 on slave Feb 13 08:17:07.457608 systemd[1]: Started mdadm.timer. Feb 13 08:17:07.464566 systemd[1]: Started systemd-tmpfiles-clean.timer. Feb 13 08:17:07.472564 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 08:17:07.472581 systemd[1]: Reached target paths.target. Feb 13 08:17:07.479553 systemd[1]: Reached target timers.target. Feb 13 08:17:07.486677 systemd[1]: Listening on dbus.socket. Feb 13 08:17:07.494160 systemd[1]: Starting docker.socket... Feb 13 08:17:07.501373 systemd[1]: Listening on sshd.socket. Feb 13 08:17:07.508641 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 08:17:07.508831 systemd[1]: Listening on docker.socket. Feb 13 08:17:07.515582 systemd[1]: Reached target sockets.target. Feb 13 08:17:07.523552 systemd[1]: Reached target basic.target. Feb 13 08:17:07.530619 systemd[1]: System is tainted: cgroupsv1 Feb 13 08:17:07.530643 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 08:17:07.530655 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 08:17:07.531160 systemd[1]: Starting containerd.service... 
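The "Positive Trust Anchors" line above is systemd-resolved's built-in DNSSEC trust anchor for the root zone, written as a DS record. A small sketch splitting it into its parts; the field meanings (key tag, algorithm, digest type) are standard DS-record semantics rather than something stated in the log:

    # The root-zone DS record logged by systemd-resolved above.
    ds = '. IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d'
    owner, klass, rtype, key_tag, algorithm, digest_type, digest = ds.split()
    print(f'key tag {key_tag}, algorithm {algorithm} (RSA/SHA-256), '
          f'digest type {digest_type} (SHA-256), digest {digest[:16]}...')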
Feb 13 08:17:07.537998 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Feb 13 08:17:07.547037 systemd[1]: Starting coreos-metadata.service... Feb 13 08:17:07.554165 systemd[1]: Starting dbus.service... Feb 13 08:17:07.560209 systemd[1]: Starting enable-oem-cloudinit.service... Feb 13 08:17:07.565082 jq[1512]: false Feb 13 08:17:07.567164 systemd[1]: Starting extend-filesystems.service... Feb 13 08:17:07.567404 coreos-metadata[1505]: Feb 13 08:17:07.567 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 08:17:07.573328 dbus-daemon[1511]: [system] SELinux support is enabled Feb 13 08:17:07.573619 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Feb 13 08:17:07.574697 systemd[1]: Starting motdgen.service... Feb 13 08:17:07.575673 extend-filesystems[1514]: Found sda Feb 13 08:17:07.575673 extend-filesystems[1514]: Found sda1 Feb 13 08:17:07.615597 kernel: EXT4-fs (sda9): resizing filesystem from 553472 to 116605649 blocks Feb 13 08:17:07.583338 systemd[1]: Starting prepare-cni-plugins.service... Feb 13 08:17:07.615699 coreos-metadata[1508]: Feb 13 08:17:07.576 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 08:17:07.615824 extend-filesystems[1514]: Found sda2 Feb 13 08:17:07.615824 extend-filesystems[1514]: Found sda3 Feb 13 08:17:07.615824 extend-filesystems[1514]: Found usr Feb 13 08:17:07.615824 extend-filesystems[1514]: Found sda4 Feb 13 08:17:07.615824 extend-filesystems[1514]: Found sda6 Feb 13 08:17:07.615824 extend-filesystems[1514]: Found sda7 Feb 13 08:17:07.615824 extend-filesystems[1514]: Found sda9 Feb 13 08:17:07.615824 extend-filesystems[1514]: Checking size of /dev/sda9 Feb 13 08:17:07.615824 extend-filesystems[1514]: Resized partition /dev/sda9 Feb 13 08:17:07.742510 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Feb 13 08:17:07.609360 systemd[1]: Starting prepare-critools.service... Feb 13 08:17:07.742719 extend-filesystems[1530]: resize2fs 1.46.5 (30-Dec-2021) Feb 13 08:17:07.624305 systemd[1]: Starting prepare-helm.service... Feb 13 08:17:07.639163 systemd[1]: Starting ssh-key-proc-cmdline.service... Feb 13 08:17:07.662683 systemd[1]: Starting sshd-keygen.service... Feb 13 08:17:07.676626 systemd[1]: Starting systemd-logind.service... Feb 13 08:17:07.690508 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 08:17:07.758016 update_engine[1550]: I0213 08:17:07.749112 1550 main.cc:92] Flatcar Update Engine starting Feb 13 08:17:07.758016 update_engine[1550]: I0213 08:17:07.752087 1550 update_check_scheduler.cc:74] Next update check in 9m57s Feb 13 08:17:07.691214 systemd[1]: Starting tcsd.service... Feb 13 08:17:07.758193 jq[1551]: true Feb 13 08:17:07.702388 systemd-logind[1548]: Watching system buttons on /dev/input/event3 (Power Button) Feb 13 08:17:07.702397 systemd-logind[1548]: Watching system buttons on /dev/input/event2 (Sleep Button) Feb 13 08:17:07.702407 systemd-logind[1548]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Feb 13 08:17:07.702509 systemd-logind[1548]: New seat seat0. Feb 13 08:17:07.703323 systemd[1]: Starting update-engine.service... Feb 13 08:17:07.710112 systemd[1]: Starting update-ssh-keys-after-ignition.service... Feb 13 08:17:07.734904 systemd[1]: Started dbus.service. 
Feb 13 08:17:07.751233 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 08:17:07.751356 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Feb 13 08:17:07.751499 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 08:17:07.751604 systemd[1]: Finished motdgen.service. Feb 13 08:17:07.765911 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 08:17:07.766074 systemd[1]: Finished ssh-key-proc-cmdline.service. Feb 13 08:17:07.770364 tar[1556]: ./ Feb 13 08:17:07.770364 tar[1556]: ./macvlan Feb 13 08:17:07.778148 jq[1562]: true Feb 13 08:17:07.779052 tar[1557]: crictl Feb 13 08:17:07.779569 dbus-daemon[1511]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 13 08:17:07.780411 tar[1558]: linux-amd64/helm Feb 13 08:17:07.784158 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Feb 13 08:17:07.784317 systemd[1]: Condition check resulted in tcsd.service being skipped. Feb 13 08:17:07.785733 systemd[1]: Started update-engine.service. Feb 13 08:17:07.792941 env[1563]: time="2024-02-13T08:17:07.792912719Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Feb 13 08:17:07.793282 tar[1556]: ./static Feb 13 08:17:07.797595 systemd[1]: Started systemd-logind.service. Feb 13 08:17:07.801348 env[1563]: time="2024-02-13T08:17:07.801331261Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 08:17:07.801404 env[1563]: time="2024-02-13T08:17:07.801394539Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:17:07.802026 env[1563]: time="2024-02-13T08:17:07.801996101Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.148-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 08:17:07.802026 env[1563]: time="2024-02-13T08:17:07.802012141Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:17:07.803814 env[1563]: time="2024-02-13T08:17:07.803768552Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 08:17:07.803814 env[1563]: time="2024-02-13T08:17:07.803786414Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 08:17:07.803814 env[1563]: time="2024-02-13T08:17:07.803798647Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Feb 13 08:17:07.803814 env[1563]: time="2024-02-13T08:17:07.803808214Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 08:17:07.806086 env[1563]: time="2024-02-13T08:17:07.806047415Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:17:07.806225 bash[1588]: Updated "/home/core/.ssh/authorized_keys" Feb 13 08:17:07.806305 env[1563]: time="2024-02-13T08:17:07.806211422Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
type=io.containerd.snapshotter.v1 Feb 13 08:17:07.806357 env[1563]: time="2024-02-13T08:17:07.806343770Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 08:17:07.806376 env[1563]: time="2024-02-13T08:17:07.806358684Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 08:17:07.806400 env[1563]: time="2024-02-13T08:17:07.806391746Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Feb 13 08:17:07.806422 env[1563]: time="2024-02-13T08:17:07.806402442Z" level=info msg="metadata content store policy set" policy=shared Feb 13 08:17:07.806812 systemd[1]: Finished update-ssh-keys-after-ignition.service. Feb 13 08:17:07.813512 env[1563]: time="2024-02-13T08:17:07.813463931Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 08:17:07.813512 env[1563]: time="2024-02-13T08:17:07.813495843Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 08:17:07.813512 env[1563]: time="2024-02-13T08:17:07.813507862Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 08:17:07.813585 env[1563]: time="2024-02-13T08:17:07.813528321Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813585 env[1563]: time="2024-02-13T08:17:07.813540011Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813585 env[1563]: time="2024-02-13T08:17:07.813550957Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813585 env[1563]: time="2024-02-13T08:17:07.813562704Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813585 env[1563]: time="2024-02-13T08:17:07.813574332Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813695 env[1563]: time="2024-02-13T08:17:07.813585790Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813695 env[1563]: time="2024-02-13T08:17:07.813597910Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813695 env[1563]: time="2024-02-13T08:17:07.813609991Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.813695 env[1563]: time="2024-02-13T08:17:07.813621672Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Feb 13 08:17:07.813695 env[1563]: time="2024-02-13T08:17:07.813685534Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 08:17:07.813777 env[1563]: time="2024-02-13T08:17:07.813741656Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." 
type=io.containerd.monitor.v1 Feb 13 08:17:07.814009 env[1563]: time="2024-02-13T08:17:07.813998412Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 08:17:07.814032 env[1563]: time="2024-02-13T08:17:07.814020897Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814051 env[1563]: time="2024-02-13T08:17:07.814033465Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 08:17:07.814077 env[1563]: time="2024-02-13T08:17:07.814069112Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814097 env[1563]: time="2024-02-13T08:17:07.814080577Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814097 env[1563]: time="2024-02-13T08:17:07.814090735Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814127 env[1563]: time="2024-02-13T08:17:07.814101532Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814127 env[1563]: time="2024-02-13T08:17:07.814112426Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814127 env[1563]: time="2024-02-13T08:17:07.814123140Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814175 env[1563]: time="2024-02-13T08:17:07.814134855Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814175 env[1563]: time="2024-02-13T08:17:07.814145256Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814175 env[1563]: time="2024-02-13T08:17:07.814156733Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 08:17:07.814248 env[1563]: time="2024-02-13T08:17:07.814239697Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814269 env[1563]: time="2024-02-13T08:17:07.814253079Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814269 env[1563]: time="2024-02-13T08:17:07.814263194Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 08:17:07.814300 env[1563]: time="2024-02-13T08:17:07.814272103Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 08:17:07.814300 env[1563]: time="2024-02-13T08:17:07.814283309Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Feb 13 08:17:07.814300 env[1563]: time="2024-02-13T08:17:07.814292331Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 08:17:07.814345 env[1563]: time="2024-02-13T08:17:07.814307208Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Feb 13 08:17:07.814345 env[1563]: time="2024-02-13T08:17:07.814339600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 13 08:17:07.814783 env[1563]: time="2024-02-13T08:17:07.814732074Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.814793487Z" level=info msg="Connect containerd service" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.814819647Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815178911Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815271881Z" level=info msg="Start subscribing containerd event" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815304421Z" level=info msg="Start recovering state" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815326508Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815340124Z" level=info msg="Start event monitor" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815349664Z" level=info msg="Start snapshots syncer" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815355940Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815357439Z" level=info msg="Start cni network conf syncer for default" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815368790Z" level=info msg="Start streaming server" Feb 13 08:17:07.816511 env[1563]: time="2024-02-13T08:17:07.815389260Z" level=info msg="containerd successfully booted in 0.025358s" Feb 13 08:17:07.817760 systemd[1]: Started containerd.service. Feb 13 08:17:07.819159 tar[1556]: ./vlan Feb 13 08:17:07.827178 systemd[1]: Started locksmithd.service. Feb 13 08:17:07.833618 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 08:17:07.833718 systemd[1]: Reached target system-config.target. Feb 13 08:17:07.839233 tar[1556]: ./portmap Feb 13 08:17:07.842594 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 08:17:07.842694 systemd[1]: Reached target user-config.target. Feb 13 08:17:07.858372 tar[1556]: ./host-local Feb 13 08:17:07.876200 tar[1556]: ./vrf Feb 13 08:17:07.886280 locksmithd[1603]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 08:17:07.894422 tar[1556]: ./bridge Feb 13 08:17:07.916280 tar[1556]: ./tuning Feb 13 08:17:07.934505 tar[1556]: ./firewall Feb 13 08:17:07.958121 tar[1556]: ./host-device Feb 13 08:17:07.978689 tar[1556]: ./sbr Feb 13 08:17:07.997482 tar[1556]: ./loopback Feb 13 08:17:08.015342 tar[1556]: ./dhcp Feb 13 08:17:08.039555 systemd[1]: Finished prepare-critools.service. Feb 13 08:17:08.044322 tar[1558]: linux-amd64/LICENSE Feb 13 08:17:08.044362 tar[1558]: linux-amd64/README.md Feb 13 08:17:08.050003 systemd[1]: Finished prepare-helm.service. Feb 13 08:17:08.066600 tar[1556]: ./ptp Feb 13 08:17:08.088777 tar[1556]: ./ipvlan Feb 13 08:17:08.110182 tar[1556]: ./bandwidth Feb 13 08:17:08.131473 kernel: EXT4-fs (sda9): resized filesystem to 116605649 Feb 13 08:17:08.161628 extend-filesystems[1530]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Feb 13 08:17:08.161628 extend-filesystems[1530]: old_desc_blocks = 1, new_desc_blocks = 56 Feb 13 08:17:08.161628 extend-filesystems[1530]: The filesystem on /dev/sda9 is now 116605649 (4k) blocks long. Feb 13 08:17:08.187552 extend-filesystems[1514]: Resized filesystem in /dev/sda9 Feb 13 08:17:08.187552 extend-filesystems[1514]: Found sdb Feb 13 08:17:08.162324 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 08:17:08.162445 systemd[1]: Finished extend-filesystems.service. Feb 13 08:17:08.172911 systemd[1]: Finished prepare-cni-plugins.service. Feb 13 08:17:08.476510 systemd-networkd[1419]: bond0: Gained IPv6LL Feb 13 08:17:09.383567 kernel: mlx5_core 0000:01:00.0: lag map port 1:1 port 2:2 shared_fdb:0 Feb 13 08:17:09.460109 sshd_keygen[1547]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 08:17:09.472265 systemd[1]: Finished sshd-keygen.service. Feb 13 08:17:09.480577 systemd[1]: Starting issuegen.service... Feb 13 08:17:09.487933 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 08:17:09.488047 systemd[1]: Finished issuegen.service. Feb 13 08:17:09.495433 systemd[1]: Starting systemd-user-sessions.service... Feb 13 08:17:09.504850 systemd[1]: Finished systemd-user-sessions.service. Feb 13 08:17:09.514289 systemd[1]: Started getty@tty1.service. 
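The extend-filesystems/resize2fs messages above grow the root filesystem on /dev/sda9 from 553472 to 116605649 blocks of 4 KiB each. Converting those figures (plain arithmetic on numbers taken from the log):

    # Block counts from the EXT4-fs / resize2fs messages above; 4 KiB blocks.
    BLOCK = 4096
    old_blocks, new_blocks = 553_472, 116_605_649
    gib = 1024 ** 3
    print(f'before: {old_blocks * BLOCK / gib:.1f} GiB, '
          f'after: {new_blocks * BLOCK / gib:.1f} GiB')
    # -> before: 2.1 GiB, after: 444.8 GiB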
Feb 13 08:17:09.522225 systemd[1]: Started serial-getty@ttyS1.service. Feb 13 08:17:09.530693 systemd[1]: Reached target getty.target. Feb 13 08:17:13.469980 coreos-metadata[1505]: Feb 13 08:17:13.469 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 08:17:13.470866 coreos-metadata[1508]: Feb 13 08:17:13.469 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 08:17:14.470104 coreos-metadata[1508]: Feb 13 08:17:14.469 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 08:17:14.470364 coreos-metadata[1505]: Feb 13 08:17:14.469 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 08:17:14.542632 login[1643]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 08:17:14.551207 systemd-logind[1548]: New session 1 of user core. Feb 13 08:17:14.551396 login[1642]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 08:17:14.551787 systemd[1]: Created slice user-500.slice. Feb 13 08:17:14.552285 systemd[1]: Starting user-runtime-dir@500.service... Feb 13 08:17:14.554825 systemd-logind[1548]: New session 2 of user core. Feb 13 08:17:14.557889 systemd[1]: Finished user-runtime-dir@500.service. Feb 13 08:17:14.558602 systemd[1]: Starting user@500.service... Feb 13 08:17:14.560690 (systemd)[1647]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:14.626297 systemd[1647]: Queued start job for default target default.target. Feb 13 08:17:14.626395 systemd[1647]: Reached target paths.target. Feb 13 08:17:14.626405 systemd[1647]: Reached target sockets.target. Feb 13 08:17:14.626412 systemd[1647]: Reached target timers.target. Feb 13 08:17:14.626418 systemd[1647]: Reached target basic.target. Feb 13 08:17:14.626437 systemd[1647]: Reached target default.target. Feb 13 08:17:14.626449 systemd[1647]: Startup finished in 62ms. Feb 13 08:17:14.626516 systemd[1]: Started user@500.service. Feb 13 08:17:14.627129 systemd[1]: Started session-1.scope. Feb 13 08:17:14.627473 systemd[1]: Started session-2.scope. Feb 13 08:17:14.954700 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:2 port 2:2 Feb 13 08:17:14.954844 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:1 port 2:2 Feb 13 08:17:15.522846 systemd[1]: Created slice system-sshd.slice. Feb 13 08:17:15.523547 systemd[1]: Started sshd@0-145.40.67.79:22-139.178.68.195:46160.service. Feb 13 08:17:15.537894 coreos-metadata[1505]: Feb 13 08:17:15.537 INFO Fetch successful Feb 13 08:17:15.538336 coreos-metadata[1508]: Feb 13 08:17:15.538 INFO Fetch successful Feb 13 08:17:15.560349 sshd[1669]: Accepted publickey for core from 139.178.68.195 port 46160 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:15.560569 unknown[1505]: wrote ssh authorized keys file for user: core Feb 13 08:17:15.560842 systemd[1]: Finished coreos-metadata.service. Feb 13 08:17:15.561262 sshd[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:15.561997 systemd[1]: Started packet-phone-home.service. Feb 13 08:17:15.565000 systemd-logind[1548]: New session 3 of user core. Feb 13 08:17:15.565375 systemd[1]: Started session-3.scope. 
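The coreos-metadata lines above show both fetchers failing DNS resolution for metadata.packet.net early in boot, then retrying and succeeding once name resolution became available. A rough Python sketch of that fetch-with-retry pattern; the URL comes from the log, while the retry interval, attempt cap, and use of urllib are assumptions made for illustration:

    import time
    import urllib.request

    # Illustrative retry loop; coreos-metadata's real implementation differs.
    URL = 'https://metadata.packet.net/metadata'

    def fetch_metadata(attempts: int = 10, delay: float = 1.0) -> bytes:
        for attempt in range(1, attempts + 1):
            try:
                print(f'Fetching {URL}: Attempt #{attempt}')
                with urllib.request.urlopen(URL, timeout=10) as resp:
                    return resp.read()
            except OSError as exc:  # DNS failures surface as OSError/URLError
                print(f'Failed to fetch: {exc}')
                time.sleep(delay)
        raise RuntimeError('metadata service unreachable')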
Feb 13 08:17:15.568344 curl[1677]: % Total % Received % Xferd Average Speed Time Time Time Current Feb 13 08:17:15.568463 curl[1677]: Dload Upload Total Spent Left Speed Feb 13 08:17:15.571779 update-ssh-keys[1678]: Updated "/home/core/.ssh/authorized_keys" Feb 13 08:17:15.572044 systemd[1]: Finished coreos-metadata-sshkeys@core.service. Feb 13 08:17:15.572196 systemd[1]: Reached target multi-user.target. Feb 13 08:17:15.572925 systemd[1]: Starting systemd-update-utmp-runlevel.service... Feb 13 08:17:15.576736 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Feb 13 08:17:15.576863 systemd[1]: Finished systemd-update-utmp-runlevel.service. Feb 13 08:17:15.577029 systemd[1]: Startup finished in 21.246s (kernel) + 14.745s (userspace) = 35.991s. Feb 13 08:17:15.156823 systemd-timesyncd[1495]: Contacted time server 209.51.161.238:123 (0.flatcar.pool.ntp.org). Feb 13 08:17:15.175464 systemd-journald[1302]: Time jumped backwards, rotating. Feb 13 08:17:15.156828 systemd-resolved[1493]: Clock change detected. Flushing caches. Feb 13 08:17:15.156850 systemd-timesyncd[1495]: Initial clock synchronization to Tue 2024-02-13 08:17:15.156753 UTC. Feb 13 08:17:15.184878 systemd[1]: Started sshd@1-145.40.67.79:22-139.178.68.195:46162.service. Feb 13 08:17:15.212208 sshd[1687]: Accepted publickey for core from 139.178.68.195 port 46162 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:15.212877 sshd[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:15.215361 systemd-logind[1548]: New session 4 of user core. Feb 13 08:17:15.215746 systemd[1]: Started session-4.scope. Feb 13 08:17:15.280381 sshd[1687]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:15.285230 systemd[1]: Started sshd@2-145.40.67.79:22-139.178.68.195:46166.service. Feb 13 08:17:15.285466 systemd[1]: sshd@1-145.40.67.79:22-139.178.68.195:46162.service: Deactivated successfully. Feb 13 08:17:15.285895 systemd-logind[1548]: Session 4 logged out. Waiting for processes to exit. Feb 13 08:17:15.285934 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 08:17:15.286387 systemd-logind[1548]: Removed session 4. Feb 13 08:17:15.311405 sshd[1692]: Accepted publickey for core from 139.178.68.195 port 46166 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:15.312081 sshd[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:15.314540 systemd-logind[1548]: New session 5 of user core. Feb 13 08:17:15.314905 systemd[1]: Started session-5.scope. Feb 13 08:17:15.365488 sshd[1692]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:15.370600 systemd[1]: Started sshd@3-145.40.67.79:22-139.178.68.195:46174.service. Feb 13 08:17:15.372065 systemd[1]: sshd@2-145.40.67.79:22-139.178.68.195:46166.service: Deactivated successfully. Feb 13 08:17:15.374498 systemd-logind[1548]: Session 5 logged out. Waiting for processes to exit. Feb 13 08:17:15.374570 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 08:17:15.376869 systemd-logind[1548]: Removed session 5. Feb 13 08:17:15.433371 sshd[1700]: Accepted publickey for core from 139.178.68.195 port 46174 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:15.436107 sshd[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:15.445279 systemd-logind[1548]: New session 6 of user core. Feb 13 08:17:15.447350 systemd[1]: Started session-6.scope. 
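Within the block above the journal timestamps appear to run backwards (08:17:15.577 is followed by 08:17:15.156): systemd-timesyncd stepped the clock at its first synchronization, systemd-resolved flushed its caches, and journald rotated because of the jump. The approximate size of the step can be read off the log itself (arithmetic only; the true step may be slightly larger):

    from datetime import datetime

    # "Startup finished" was stamped just before the step; timesyncd then set
    # the clock to 08:17:15.156753 UTC, so the step was at least roughly:
    before = datetime.fromisoformat('2024-02-13 08:17:15.577029')
    after = datetime.fromisoformat('2024-02-13 08:17:15.156753')
    print(f'clock stepped back by about {(before - after).total_seconds():.3f} s')
    # -> clock stepped back by about 0.420 s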
Feb 13 08:17:15.462144 curl[1677]: \u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 Feb 13 08:17:15.464146 systemd[1]: packet-phone-home.service: Deactivated successfully. Feb 13 08:17:15.513866 sshd[1700]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:15.515749 systemd[1]: Started sshd@4-145.40.67.79:22-139.178.68.195:46186.service. Feb 13 08:17:15.516237 systemd[1]: sshd@3-145.40.67.79:22-139.178.68.195:46174.service: Deactivated successfully. Feb 13 08:17:15.516936 systemd-logind[1548]: Session 6 logged out. Waiting for processes to exit. Feb 13 08:17:15.517032 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 08:17:15.517847 systemd-logind[1548]: Removed session 6. Feb 13 08:17:15.543852 sshd[1709]: Accepted publickey for core from 139.178.68.195 port 46186 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:15.544661 sshd[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:15.547662 systemd-logind[1548]: New session 7 of user core. Feb 13 08:17:15.548195 systemd[1]: Started session-7.scope. Feb 13 08:17:15.627146 sudo[1714]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 08:17:15.627706 sudo[1714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 08:17:15.644399 dbus-daemon[1511]: \xd0\u000d;\xa1QV: received setenforce notice (enforcing=-593589920) Feb 13 08:17:15.649106 sudo[1714]: pam_unix(sudo:session): session closed for user root Feb 13 08:17:15.653810 sshd[1709]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:15.659311 systemd[1]: Started sshd@5-145.40.67.79:22-139.178.68.195:46194.service. Feb 13 08:17:15.660840 systemd[1]: sshd@4-145.40.67.79:22-139.178.68.195:46186.service: Deactivated successfully. Feb 13 08:17:15.663042 systemd-logind[1548]: Session 7 logged out. Waiting for processes to exit. Feb 13 08:17:15.663168 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 08:17:15.665571 systemd-logind[1548]: Removed session 7. Feb 13 08:17:15.713822 sshd[1716]: Accepted publickey for core from 139.178.68.195 port 46194 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:15.715797 sshd[1716]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:15.722821 systemd-logind[1548]: New session 8 of user core. Feb 13 08:17:15.724208 systemd[1]: Started session-8.scope. Feb 13 08:17:15.795869 sudo[1723]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 08:17:15.796499 sudo[1723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 08:17:15.803937 sudo[1723]: pam_unix(sudo:session): session closed for user root Feb 13 08:17:15.816238 sudo[1722]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Feb 13 08:17:15.817154 sudo[1722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 08:17:15.840520 systemd[1]: Stopping audit-rules.service... Feb 13 08:17:15.842000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 13 08:17:15.843822 auditctl[1726]: No rules Feb 13 08:17:15.844552 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 08:17:15.845094 systemd[1]: Stopped audit-rules.service. 
Feb 13 08:17:15.848712 systemd[1]: Starting audit-rules.service... Feb 13 08:17:15.849288 kernel: kauditd_printk_skb: 326 callbacks suppressed Feb 13 08:17:15.849324 kernel: audit: type=1305 audit(1707812235.842:131): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Feb 13 08:17:15.857966 augenrules[1744]: No rules Feb 13 08:17:15.858330 systemd[1]: Finished audit-rules.service. Feb 13 08:17:15.858737 sudo[1722]: pam_unix(sudo:session): session closed for user root Feb 13 08:17:15.859525 sshd[1716]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:15.860804 systemd[1]: sshd@5-145.40.67.79:22-139.178.68.195:46194.service: Deactivated successfully. Feb 13 08:17:15.861345 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 08:17:15.861352 systemd-logind[1548]: Session 8 logged out. Waiting for processes to exit. Feb 13 08:17:15.861871 systemd-logind[1548]: Removed session 8. Feb 13 08:17:15.842000 audit[1726]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd39699680 a2=420 a3=0 items=0 ppid=1 pid=1726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:15.895872 kernel: audit: type=1300 audit(1707812235.842:131): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd39699680 a2=420 a3=0 items=0 ppid=1 pid=1726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:15.895896 kernel: audit: type=1327 audit(1707812235.842:131): proctitle=2F7362696E2F617564697463746C002D44 Feb 13 08:17:15.842000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Feb 13 08:17:15.897171 systemd[1]: Started sshd@6-145.40.67.79:22-139.178.68.195:46210.service. Feb 13 08:17:15.905448 kernel: audit: type=1131 audit(1707812235.844:132): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.844000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.927914 kernel: audit: type=1130 audit(1707812235.857:133): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.857000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.950357 kernel: audit: type=1106 audit(1707812235.857:134): pid=1722 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.857000 audit[1722]: USER_END pid=1722 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:15.976325 kernel: audit: type=1104 audit(1707812235.857:135): pid=1722 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.857000 audit[1722]: CRED_DISP pid=1722 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.999874 kernel: audit: type=1106 audit(1707812235.859:136): pid=1716 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:15.859000 audit[1716]: USER_END pid=1716 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:16.032081 kernel: audit: type=1104 audit(1707812235.859:137): pid=1716 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:15.859000 audit[1716]: CRED_DISP pid=1716 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:16.058077 kernel: audit: type=1131 audit(1707812235.859:138): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-145.40.67.79:22-139.178.68.195:46194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:15.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-145.40.67.79:22-139.178.68.195:46194 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:16.063590 sshd[1751]: Accepted publickey for core from 139.178.68.195 port 46210 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:17:16.064280 sshd[1751]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:17:16.066609 systemd-logind[1548]: New session 9 of user core. Feb 13 08:17:16.067133 systemd[1]: Started session-9.scope. Feb 13 08:17:15.896000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.67.79:22-139.178.68.195:46210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:16.062000 audit[1751]: USER_ACCT pid=1751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:16.063000 audit[1751]: CRED_ACQ pid=1751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:16.063000 audit[1751]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe27fe21b0 a2=3 a3=0 items=0 ppid=1 pid=1751 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:16.063000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:17:16.069000 audit[1751]: USER_START pid=1751 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:16.069000 audit[1754]: CRED_ACQ pid=1754 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:16.127000 audit[1755]: USER_ACCT pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:16.128000 audit[1755]: CRED_REFR pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:16.128993 sudo[1755]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 08:17:16.129120 sudo[1755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Feb 13 08:17:16.129000 audit[1755]: USER_START pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:20.173178 systemd[1]: Starting systemd-networkd-wait-online.service... Feb 13 08:17:20.187885 systemd[1]: Finished systemd-networkd-wait-online.service. Feb 13 08:17:20.186000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:20.188143 systemd[1]: Reached target network-online.target. Feb 13 08:17:20.189067 systemd[1]: Starting docker.service... 
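The audit records interleaved above (USER_ACCT, CRED_ACQ/CRED_REFR, USER_START, and the earlier USER_END/CRED_DISP) trace the PAM account, credential, and session phases of the SSH login and the sudo invocation. A small Python sketch that groups such records by pid to make each lifecycle visible; the sample lines are abbreviated copies of records from this log:

    import re
    from collections import defaultdict

    # Group audit records by pid: USER_ACCT (account check) ->
    # CRED_ACQ/CRED_REFR (credentials) -> USER_START (session open) ->
    # USER_END / CRED_DISP (session close).
    lines = [
        'audit[1751]: USER_ACCT pid=1751 msg=op=PAM:accounting acct="core"',
        'audit[1751]: CRED_ACQ pid=1751 msg=op=PAM:setcred acct="core"',
        'audit[1751]: USER_START pid=1751 msg=op=PAM:session_open acct="core"',
        'audit[1755]: USER_ACCT pid=1755 msg=op=PAM:accounting acct="core"',
        'audit[1755]: CRED_REFR pid=1755 msg=op=PAM:setcred acct="root"',
        'audit[1755]: USER_START pid=1755 msg=op=PAM:session_open acct="root"',
    ]
    events = defaultdict(list)
    for line in lines:
        m = re.search(r'audit\[(\d+)\]: (\w+)', line)
        if m:
            events[m.group(1)].append(m.group(2))
    for pid, types in events.items():
        print(pid, '->', ' '.join(types))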
Feb 13 08:17:20.208152 env[1776]: time="2024-02-13T08:17:20.208093338Z" level=info msg="Starting up" Feb 13 08:17:20.208807 env[1776]: time="2024-02-13T08:17:20.208768421Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 13 08:17:20.208807 env[1776]: time="2024-02-13T08:17:20.208777963Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 13 08:17:20.208807 env[1776]: time="2024-02-13T08:17:20.208789757Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 13 08:17:20.208807 env[1776]: time="2024-02-13T08:17:20.208796130Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 13 08:17:20.209600 env[1776]: time="2024-02-13T08:17:20.209560254Z" level=info msg="parsed scheme: \"unix\"" module=grpc Feb 13 08:17:20.209600 env[1776]: time="2024-02-13T08:17:20.209569977Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Feb 13 08:17:20.209600 env[1776]: time="2024-02-13T08:17:20.209579769Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Feb 13 08:17:20.209600 env[1776]: time="2024-02-13T08:17:20.209585427Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Feb 13 08:17:20.626753 env[1776]: time="2024-02-13T08:17:20.626636794Z" level=warning msg="Your kernel does not support cgroup blkio weight" Feb 13 08:17:20.626753 env[1776]: time="2024-02-13T08:17:20.626687479Z" level=warning msg="Your kernel does not support cgroup blkio weight_device" Feb 13 08:17:20.627218 env[1776]: time="2024-02-13T08:17:20.627129785Z" level=info msg="Loading containers: start." 
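The NETFILTER_CFG records that follow are the kernel's audit trail of dockerd creating its iptables chains (DOCKER, DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER) through xtables-nft-multi; their hex PROCTITLE values decode to the individual iptables invocations, exactly as in the decoding sketch shown after the auditctl record earlier. A small sketch summarizing one such record; the sample line is an abbreviated copy of the first record below:

    import re

    # Summarize a NETFILTER_CFG audit record from the Docker startup below.
    record = ('audit: NETFILTER_CFG table=nat:2 family=2 entries=2 '
              'op=nft_register_chain pid=1818 comm="iptables"')
    m = re.search(r'table=(\w+):\d+ family=(\d+) entries=(\d+) op=(\w+)', record)
    table, family, entries, op = m.groups()
    family_name = {'2': 'AF_INET', '10': 'AF_INET6'}.get(family, family)
    print(f'{op} in table {table} ({family_name}), {entries} entries')
    # -> nft_register_chain in table nat (AF_INET), 2 entries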
Feb 13 08:17:20.680000 audit[1818]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1818 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.680000 audit[1818]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffc142771d0 a2=0 a3=7ffc142771bc items=0 ppid=1776 pid=1818 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.680000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Feb 13 08:17:20.681000 audit[1820]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1820 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.681000 audit[1820]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffdb4a1da80 a2=0 a3=7ffdb4a1da6c items=0 ppid=1776 pid=1820 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.681000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Feb 13 08:17:20.681000 audit[1822]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1822 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.681000 audit[1822]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffda5964ef0 a2=0 a3=7ffda5964edc items=0 ppid=1776 pid=1822 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.681000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 08:17:20.682000 audit[1824]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1824 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.682000 audit[1824]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff39c9bb00 a2=0 a3=7fff39c9baec items=0 ppid=1776 pid=1824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.682000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 08:17:20.684000 audit[1826]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1826 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.684000 audit[1826]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffcc81f2a60 a2=0 a3=7ffcc81f2a4c items=0 ppid=1776 pid=1826 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.684000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Feb 13 08:17:20.706000 audit[1831]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1831 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Feb 13 08:17:20.706000 audit[1831]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff748ee9e0 a2=0 a3=7fff748ee9cc items=0 ppid=1776 pid=1831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.706000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Feb 13 08:17:20.709000 audit[1833]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1833 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.709000 audit[1833]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd6d7bcd10 a2=0 a3=7ffd6d7bccfc items=0 ppid=1776 pid=1833 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.709000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Feb 13 08:17:20.710000 audit[1835]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1835 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.710000 audit[1835]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc143dca30 a2=0 a3=7ffc143dca1c items=0 ppid=1776 pid=1835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.710000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Feb 13 08:17:20.712000 audit[1837]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1837 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.712000 audit[1837]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7fff1e3bc660 a2=0 a3=7fff1e3bc64c items=0 ppid=1776 pid=1837 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.712000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:17:20.716000 audit[1841]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1841 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.716000 audit[1841]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffcf4895620 a2=0 a3=7ffcf489560c items=0 ppid=1776 pid=1841 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.716000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:17:20.716000 audit[1843]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1843 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.716000 audit[1843]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fff4feaabb0 a2=0 a3=7fff4feaab9c items=0 ppid=1776 
pid=1843 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.716000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:17:20.729074 kernel: Initializing XFRM netlink socket Feb 13 08:17:20.750899 env[1776]: time="2024-02-13T08:17:20.750882491Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address" Feb 13 08:17:20.761000 audit[1851]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1851 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.761000 audit[1851]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffdc1139670 a2=0 a3=7ffdc113965c items=0 ppid=1776 pid=1851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.761000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Feb 13 08:17:20.777000 audit[1854]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1854 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.777000 audit[1854]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffc34f51ab0 a2=0 a3=7ffc34f51a9c items=0 ppid=1776 pid=1854 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.777000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Feb 13 08:17:20.779000 audit[1857]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1857 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.779000 audit[1857]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffe429ccf60 a2=0 a3=7ffe429ccf4c items=0 ppid=1776 pid=1857 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.779000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Feb 13 08:17:20.780000 audit[1859]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1859 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.780000 audit[1859]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffdc89c08a0 a2=0 a3=7ffdc89c088c items=0 ppid=1776 pid=1859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.780000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Feb 13 08:17:20.781000 audit[1861]: NETFILTER_CFG 
table=nat:17 family=2 entries=2 op=nft_register_chain pid=1861 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.781000 audit[1861]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffffd90fa90 a2=0 a3=7ffffd90fa7c items=0 ppid=1776 pid=1861 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.781000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Feb 13 08:17:20.783000 audit[1863]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1863 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.783000 audit[1863]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7ffceaeeaf30 a2=0 a3=7ffceaeeaf1c items=0 ppid=1776 pid=1863 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.783000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Feb 13 08:17:20.784000 audit[1865]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1865 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.784000 audit[1865]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7ffc96dfc2e0 a2=0 a3=7ffc96dfc2cc items=0 ppid=1776 pid=1865 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.784000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Feb 13 08:17:20.792000 audit[1868]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1868 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.792000 audit[1868]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffd78c56830 a2=0 a3=7ffd78c5681c items=0 ppid=1776 pid=1868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.792000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Feb 13 08:17:20.793000 audit[1870]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1870 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.793000 audit[1870]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7ffc997fd5e0 a2=0 a3=7ffc997fd5cc items=0 ppid=1776 pid=1870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.793000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 08:17:20.795000 audit[1872]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1872 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.795000 audit[1872]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fffb6fbdb80 a2=0 a3=7fffb6fbdb6c items=0 ppid=1776 pid=1872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.795000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 08:17:20.797000 audit[1874]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1874 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.797000 audit[1874]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fffc1221710 a2=0 a3=7fffc12216fc items=0 ppid=1776 pid=1874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.797000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Feb 13 08:17:20.798606 systemd-networkd[1419]: docker0: Link UP Feb 13 08:17:20.819000 audit[1878]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1878 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.819000 audit[1878]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffde1cdcec0 a2=0 a3=7ffde1cdceac items=0 ppid=1776 pid=1878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.819000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:17:20.821000 audit[1879]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1879 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:20.821000 audit[1879]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffe00a4ad70 a2=0 a3=7ffe00a4ad5c items=0 ppid=1776 pid=1879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:20.821000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:17:20.823234 env[1776]: time="2024-02-13T08:17:20.823135189Z" level=info msg="Loading containers: done." 
Feb 13 08:17:20.842458 env[1776]: time="2024-02-13T08:17:20.842437301Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 08:17:20.842560 env[1776]: time="2024-02-13T08:17:20.842546941Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Feb 13 08:17:20.842615 env[1776]: time="2024-02-13T08:17:20.842604758Z" level=info msg="Daemon has completed initialization" Feb 13 08:17:20.843542 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1445136563-merged.mount: Deactivated successfully. Feb 13 08:17:20.849007 systemd[1]: Started docker.service. Feb 13 08:17:20.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:20.851962 env[1776]: time="2024-02-13T08:17:20.851913138Z" level=info msg="API listen on /run/docker.sock" Feb 13 08:17:20.861823 systemd[1]: Reloading. Feb 13 08:17:20.870058 kernel: kauditd_printk_skb: 84 callbacks suppressed Feb 13 08:17:20.870096 kernel: audit: type=1130 audit(1707812240.848:173): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:20.896319 /usr/lib/systemd/system-generators/torcx-generator[1930]: time="2024-02-13T08:17:20Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:17:20.896335 /usr/lib/systemd/system-generators/torcx-generator[1930]: time="2024-02-13T08:17:20Z" level=info msg="torcx already run" Feb 13 08:17:20.948615 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:17:20.948622 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:17:20.959471 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 08:17:21.012180 systemd[1]: Started kubelet.service. Feb 13 08:17:21.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:21.034642 kubelet[1994]: E0213 08:17:21.034584 1994 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set" Feb 13 08:17:21.035925 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 08:17:21.036031 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 08:17:21.035000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Feb 13 08:17:21.068056 kernel: audit: type=1130 audit(1707812241.011:174): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:21.068087 kernel: audit: type=1131 audit(1707812241.035:175): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 08:17:21.791017 env[1563]: time="2024-02-13T08:17:21.790880627Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\"" Feb 13 08:17:22.422904 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1391056860.mount: Deactivated successfully. Feb 13 08:17:23.749771 env[1563]: time="2024-02-13T08:17:23.749746065Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:23.750349 env[1563]: time="2024-02-13T08:17:23.750310032Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:23.751365 env[1563]: time="2024-02-13T08:17:23.751319692Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:23.752706 env[1563]: time="2024-02-13T08:17:23.752664994Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:2f28bed4096abd572a56595ac0304238bdc271dcfe22c650707c09bf97ec16fd,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:23.753018 env[1563]: time="2024-02-13T08:17:23.752956467Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\" returns image reference \"sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f\"" Feb 13 08:17:23.758331 env[1563]: time="2024-02-13T08:17:23.758262125Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\"" Feb 13 08:17:25.312404 env[1563]: time="2024-02-13T08:17:25.312349056Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:25.312975 env[1563]: time="2024-02-13T08:17:25.312934166Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:25.313987 env[1563]: time="2024-02-13T08:17:25.313947633Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:25.315109 env[1563]: time="2024-02-13T08:17:25.315066939Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:fda420c6c15cdd01c4eba3404f0662fe486a9c7f38fa13c741a21334673841a2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:25.315882 env[1563]: time="2024-02-13T08:17:25.315841498Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\" returns image 
reference \"sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486\"" Feb 13 08:17:25.322013 env[1563]: time="2024-02-13T08:17:25.321995557Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\"" Feb 13 08:17:26.318832 env[1563]: time="2024-02-13T08:17:26.318800399Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:26.319624 env[1563]: time="2024-02-13T08:17:26.319575489Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:26.320475 env[1563]: time="2024-02-13T08:17:26.320424486Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:26.321802 env[1563]: time="2024-02-13T08:17:26.321758609Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c3c7303ee6d01c8e5a769db28661cf854b55175aa72c67e9b6a7b9d47ac42af3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:26.322085 env[1563]: time="2024-02-13T08:17:26.322042840Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\" returns image reference \"sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e\"" Feb 13 08:17:26.327553 env[1563]: time="2024-02-13T08:17:26.327503735Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\"" Feb 13 08:17:27.232868 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount130692831.mount: Deactivated successfully. Feb 13 08:17:27.522391 env[1563]: time="2024-02-13T08:17:27.522320404Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:27.522951 env[1563]: time="2024-02-13T08:17:27.522938177Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:27.523780 env[1563]: time="2024-02-13T08:17:27.523730239Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:27.524563 env[1563]: time="2024-02-13T08:17:27.524504818Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:f6e0de32a002b910b9b2e0e8d769e2d7b05208240559c745ce4781082ab15f22,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:27.524893 env[1563]: time="2024-02-13T08:17:27.524879003Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\" returns image reference \"sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f\"" Feb 13 08:17:27.531678 env[1563]: time="2024-02-13T08:17:27.531660280Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Feb 13 08:17:28.067179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3328451363.mount: Deactivated successfully. 
Feb 13 08:17:28.068482 env[1563]: time="2024-02-13T08:17:28.068441702Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:28.069111 env[1563]: time="2024-02-13T08:17:28.069071983Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:28.069812 env[1563]: time="2024-02-13T08:17:28.069770407Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:28.070569 env[1563]: time="2024-02-13T08:17:28.070528067Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:28.070872 env[1563]: time="2024-02-13T08:17:28.070824793Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Feb 13 08:17:28.076281 env[1563]: time="2024-02-13T08:17:28.076225915Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\"" Feb 13 08:17:28.771168 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount165394961.mount: Deactivated successfully. Feb 13 08:17:31.276290 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 08:17:31.275000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:31.276472 systemd[1]: Stopped kubelet.service. Feb 13 08:17:31.277753 systemd[1]: Started kubelet.service. Feb 13 08:17:31.309662 kubelet[2090]: E0213 08:17:31.309581 2090 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set" Feb 13 08:17:31.311831 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 08:17:31.311971 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 08:17:31.275000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:31.396415 kernel: audit: type=1130 audit(1707812251.275:176): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:31.396446 kernel: audit: type=1131 audit(1707812251.275:177): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:31.396461 kernel: audit: type=1130 audit(1707812251.276:178): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:31.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:31.457236 kernel: audit: type=1131 audit(1707812251.311:179): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 08:17:31.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 08:17:31.679464 env[1563]: time="2024-02-13T08:17:31.679382836Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:31.680145 env[1563]: time="2024-02-13T08:17:31.680105973Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:31.680906 env[1563]: time="2024-02-13T08:17:31.680865482Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:31.681697 env[1563]: time="2024-02-13T08:17:31.681653589Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:31.682107 env[1563]: time="2024-02-13T08:17:31.682058615Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\" returns image reference \"sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7\"" Feb 13 08:17:31.687859 env[1563]: time="2024-02-13T08:17:31.687843369Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\"" Feb 13 08:17:32.227790 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3001041849.mount: Deactivated successfully. 
Feb 13 08:17:32.623857 env[1563]: time="2024-02-13T08:17:32.623776872Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:32.624433 env[1563]: time="2024-02-13T08:17:32.624392173Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:32.624998 env[1563]: time="2024-02-13T08:17:32.624942100Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:32.625613 env[1563]: time="2024-02-13T08:17:32.625573491Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:32.625954 env[1563]: time="2024-02-13T08:17:32.625913787Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\" returns image reference \"sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a\"" Feb 13 08:17:34.426091 systemd[1]: Stopped kubelet.service. Feb 13 08:17:34.425000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:34.425000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:34.497439 systemd[1]: Reloading. Feb 13 08:17:34.523261 /usr/lib/systemd/system-generators/torcx-generator[2234]: time="2024-02-13T08:17:34Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:17:34.523284 /usr/lib/systemd/system-generators/torcx-generator[2234]: time="2024-02-13T08:17:34Z" level=info msg="torcx already run" Feb 13 08:17:34.554268 kernel: audit: type=1130 audit(1707812254.425:180): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:34.554319 kernel: audit: type=1131 audit(1707812254.425:181): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:34.577643 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:17:34.577650 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:17:34.588591 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Feb 13 08:17:34.644039 systemd[1]: Started kubelet.service. Feb 13 08:17:34.643000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:34.665805 kubelet[2299]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:17:34.665805 kubelet[2299]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:17:34.665805 kubelet[2299]: I0213 08:17:34.665791 2299 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 08:17:34.666573 kubelet[2299]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:17:34.666573 kubelet[2299]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:17:34.709091 kernel: audit: type=1130 audit(1707812254.643:182): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:34.835865 kubelet[2299]: I0213 08:17:34.835854 2299 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 13 08:17:34.835865 kubelet[2299]: I0213 08:17:34.835864 2299 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 08:17:34.835976 kubelet[2299]: I0213 08:17:34.835971 2299 server.go:836] "Client rotation is on, will bootstrap in background" Feb 13 08:17:34.837390 kubelet[2299]: I0213 08:17:34.837335 2299 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 08:17:34.837774 kubelet[2299]: E0213 08:17:34.837735 2299 certificate_manager.go:471] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://145.40.67.79:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.857468 kubelet[2299]: I0213 08:17:34.857461 2299 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 08:17:34.857648 kubelet[2299]: I0213 08:17:34.857642 2299 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 08:17:34.857686 kubelet[2299]: I0213 08:17:34.857681 2299 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:cgroupfs KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 08:17:34.857742 kubelet[2299]: I0213 08:17:34.857692 2299 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 08:17:34.857742 kubelet[2299]: I0213 08:17:34.857699 2299 container_manager_linux.go:308] "Creating device plugin manager" Feb 13 08:17:34.857780 kubelet[2299]: I0213 08:17:34.857745 2299 state_mem.go:36] "Initialized new in-memory state store" Feb 13 08:17:34.859178 kubelet[2299]: I0213 08:17:34.859136 2299 kubelet.go:398] "Attempting to sync node with API server" Feb 13 08:17:34.859178 kubelet[2299]: I0213 08:17:34.859146 2299 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 08:17:34.859178 kubelet[2299]: I0213 08:17:34.859158 2299 kubelet.go:297] "Adding apiserver pod source" Feb 13 08:17:34.859178 kubelet[2299]: I0213 08:17:34.859166 2299 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 08:17:34.859434 kubelet[2299]: I0213 08:17:34.859427 2299 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 08:17:34.859462 kubelet[2299]: W0213 08:17:34.859440 2299 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://145.40.67.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.859481 kubelet[2299]: E0213 08:17:34.859469 2299 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://145.40.67.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.859527 kubelet[2299]: W0213 08:17:34.859503 2299 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get 
"https://145.40.67.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-56b02fc11a&limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.859547 kubelet[2299]: E0213 08:17:34.859536 2299 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://145.40.67.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-56b02fc11a&limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.859588 kubelet[2299]: W0213 08:17:34.859579 2299 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Feb 13 08:17:34.859793 kubelet[2299]: I0213 08:17:34.859788 2299 server.go:1186] "Started kubelet" Feb 13 08:17:34.859903 kubelet[2299]: I0213 08:17:34.859893 2299 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 08:17:34.860048 kubelet[2299]: E0213 08:17:34.860001 2299 event.go:276] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-56b02fc11a.17b35e3289f9d726", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-56b02fc11a", UID:"ci-3510.3.2-a-56b02fc11a", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-56b02fc11a"}, FirstTimestamp:time.Date(2024, time.February, 13, 8, 17, 34, 859777830, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 8, 17, 34, 859777830, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'Post "https://145.40.67.79:6443/api/v1/namespaces/default/events": dial tcp 145.40.67.79:6443: connect: connection refused'(may retry after sleeping) Feb 13 08:17:34.860048 kubelet[2299]: E0213 08:17:34.860040 2299 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 08:17:34.860125 kubelet[2299]: E0213 08:17:34.860057 2299 kubelet.go:1386] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 08:17:34.859000 audit[2299]: AVC avc: denied { mac_admin } for pid=2299 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:34.860503 kubelet[2299]: I0213 08:17:34.860459 2299 server.go:451] "Adding debug handlers to kubelet server" Feb 13 08:17:34.860503 kubelet[2299]: I0213 08:17:34.860462 2299 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 13 08:17:34.860503 kubelet[2299]: I0213 08:17:34.860483 2299 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 08:17:34.860590 kubelet[2299]: I0213 08:17:34.860510 2299 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 08:17:34.860590 kubelet[2299]: I0213 08:17:34.860547 2299 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 13 08:17:34.860590 kubelet[2299]: I0213 08:17:34.860573 2299 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Feb 13 08:17:34.861027 kubelet[2299]: W0213 08:17:34.860989 2299 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://145.40.67.79:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.861027 kubelet[2299]: E0213 08:17:34.861003 2299 controller.go:146] failed to ensure lease exists, will retry in 200ms, error: Get "https://145.40.67.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-56b02fc11a?timeout=10s": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.861115 kubelet[2299]: E0213 08:17:34.861035 2299 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://145.40.67.79:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:34.922547 kubelet[2299]: I0213 08:17:34.922503 2299 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 08:17:34.922547 kubelet[2299]: I0213 08:17:34.922516 2299 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 08:17:34.922547 kubelet[2299]: I0213 08:17:34.922523 2299 state_mem.go:36] "Initialized new in-memory state store" Feb 13 08:17:34.923482 kubelet[2299]: I0213 08:17:34.923445 2299 policy_none.go:49] "None policy: Start" Feb 13 08:17:34.923729 kubelet[2299]: I0213 08:17:34.923694 2299 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 08:17:34.923729 kubelet[2299]: I0213 08:17:34.923704 2299 state_mem.go:35] "Initializing new in-memory state store" Feb 13 08:17:34.859000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:34.958516 kernel: audit: type=1400 audit(1707812254.859:183): avc: denied { mac_admin } for pid=2299 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:34.958542 kernel: audit: type=1401 audit(1707812254.859:183): 
op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:34.958588 kernel: audit: type=1300 audit(1707812254.859:183): arch=c000003e syscall=188 success=no exit=-22 a0=c00059c900 a1=c000596660 a2=c00059c8d0 a3=25 items=0 ppid=1 pid=2299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.859000 audit[2299]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c00059c900 a1=c000596660 a2=c00059c8d0 a3=25 items=0 ppid=1 pid=2299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.961466 kubelet[2299]: I0213 08:17:34.961458 2299 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:34.961681 kubelet[2299]: E0213 08:17:34.961671 2299 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.79:6443/api/v1/nodes\": dial tcp 145.40.67.79:6443: connect: connection refused" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:34.859000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:34.859000 audit[2299]: AVC avc: denied { mac_admin } for pid=2299 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:34.859000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:34.859000 audit[2299]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0005ae560 a1=c000596678 a2=c00059c990 a3=25 items=0 ppid=1 pid=2299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.859000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:34.861000 audit[2323]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=2323 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:34.861000 audit[2323]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffce70755a0 a2=0 a3=7ffce707558c items=0 ppid=2299 pid=2323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.861000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 13 08:17:34.861000 audit[2324]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=2324 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:34.861000 audit[2324]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe78e33fd0 a2=0 a3=7ffe78e33fbc items=0 ppid=2299 pid=2324 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.861000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 08:17:34.862000 audit[2326]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=2326 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:34.862000 audit[2326]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd2b159610 a2=0 a3=7ffd2b1595fc items=0 ppid=2299 pid=2326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.862000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 08:17:34.863000 audit[2328]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=2328 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:34.863000 audit[2328]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff06b7cc70 a2=0 a3=7fff06b7cc5c items=0 ppid=2299 pid=2328 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:34.863000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 08:17:35.053000 audit[2331]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2331 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.053000 audit[2331]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fff99ff0560 a2=0 a3=7fff99ff054c items=0 ppid=2299 pid=2331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.053000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Feb 13 08:17:35.055215 kubelet[2299]: I0213 08:17:35.055206 2299 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 08:17:35.054000 audit[2299]: AVC avc: denied { mac_admin } for pid=2299 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:35.054000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:35.054000 audit[2299]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c00188c660 a1=c0018866f0 a2=c00188c630 a3=25 items=0 ppid=1 pid=2299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.054000 audit: PROCTITLE 
proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:35.055379 kubelet[2299]: I0213 08:17:35.055236 2299 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 08:17:35.055379 kubelet[2299]: I0213 08:17:35.055323 2299 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 08:17:35.054000 audit[2332]: NETFILTER_CFG table=nat:31 family=2 entries=1 op=nft_register_chain pid=2332 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.054000 audit[2332]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff89067710 a2=0 a3=7fff890676fc items=0 ppid=2299 pid=2332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.054000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 13 08:17:35.055578 kubelet[2299]: E0213 08:17:35.055517 2299 eviction_manager.go:261] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:35.061591 kubelet[2299]: E0213 08:17:35.061516 2299 controller.go:146] failed to ensure lease exists, will retry in 400ms, error: Get "https://145.40.67.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-56b02fc11a?timeout=10s": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.070000 audit[2336]: NETFILTER_CFG table=nat:32 family=2 entries=1 op=nft_register_rule pid=2336 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.070000 audit[2336]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffee85f14c0 a2=0 a3=7ffee85f14ac items=0 ppid=2299 pid=2336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.070000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 13 08:17:35.075000 audit[2339]: NETFILTER_CFG table=filter:33 family=2 entries=1 op=nft_register_rule pid=2339 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.075000 audit[2339]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7ffe76069210 a2=0 a3=7ffe760691fc items=0 ppid=2299 pid=2339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.075000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 13 08:17:35.076000 audit[2340]: NETFILTER_CFG table=nat:34 family=2 entries=1 op=nft_register_chain pid=2340 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.076000 audit[2340]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff9c5ecc10 a2=0 a3=7fff9c5ecbfc items=0 ppid=2299 pid=2340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.076000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 13 08:17:35.077000 audit[2341]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_chain pid=2341 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.077000 audit[2341]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff12c05820 a2=0 a3=7fff12c0580c items=0 ppid=2299 pid=2341 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.077000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 08:17:35.079000 audit[2343]: NETFILTER_CFG table=nat:36 family=2 entries=1 op=nft_register_rule pid=2343 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.079000 audit[2343]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffe9e596030 a2=0 a3=7ffe9e59601c items=0 ppid=2299 pid=2343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.079000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 13 08:17:35.082000 audit[2345]: NETFILTER_CFG table=nat:37 family=2 entries=1 op=nft_register_rule pid=2345 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.082000 audit[2345]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffeec2667c0 a2=0 a3=7ffeec2667ac items=0 ppid=2299 pid=2345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.082000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 08:17:35.084000 audit[2347]: NETFILTER_CFG table=nat:38 family=2 entries=1 op=nft_register_rule pid=2347 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.084000 audit[2347]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7ffeebd0c280 a2=0 a3=7ffeebd0c26c items=0 ppid=2299 pid=2347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.084000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 13 08:17:35.087000 audit[2349]: NETFILTER_CFG 
table=nat:39 family=2 entries=1 op=nft_register_rule pid=2349 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.087000 audit[2349]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7ffd334d1920 a2=0 a3=7ffd334d190c items=0 ppid=2299 pid=2349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.087000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 13 08:17:35.088000 audit[2351]: NETFILTER_CFG table=nat:40 family=2 entries=1 op=nft_register_rule pid=2351 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.088000 audit[2351]: SYSCALL arch=c000003e syscall=46 success=yes exit=540 a0=3 a1=7ffee6cba440 a2=0 a3=7ffee6cba42c items=0 ppid=2299 pid=2351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.088000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 13 08:17:35.089585 kubelet[2299]: I0213 08:17:35.089539 2299 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 13 08:17:35.088000 audit[2352]: NETFILTER_CFG table=mangle:41 family=10 entries=2 op=nft_register_chain pid=2352 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.088000 audit[2352]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff3e43b350 a2=0 a3=7fff3e43b33c items=0 ppid=2299 pid=2352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.088000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 13 08:17:35.089000 audit[2353]: NETFILTER_CFG table=mangle:42 family=2 entries=1 op=nft_register_chain pid=2353 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.089000 audit[2353]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffd38ff9f0 a2=0 a3=7fffd38ff9dc items=0 ppid=2299 pid=2353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.089000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 13 08:17:35.089000 audit[2354]: NETFILTER_CFG table=nat:43 family=10 entries=2 op=nft_register_chain pid=2354 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.089000 audit[2354]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffdbffb4930 a2=0 a3=7ffdbffb491c items=0 ppid=2299 pid=2354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.089000 audit: 
PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 13 08:17:35.089000 audit[2355]: NETFILTER_CFG table=nat:44 family=2 entries=1 op=nft_register_chain pid=2355 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.089000 audit[2355]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffce89e0580 a2=0 a3=7ffce89e056c items=0 ppid=2299 pid=2355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.089000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 13 08:17:35.090000 audit[2357]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_chain pid=2357 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:35.090000 audit[2357]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff0ff5fcd0 a2=0 a3=7fff0ff5fcbc items=0 ppid=2299 pid=2357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.090000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 13 08:17:35.090000 audit[2358]: NETFILTER_CFG table=nat:46 family=10 entries=1 op=nft_register_rule pid=2358 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.090000 audit[2358]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7fff9b1a6d90 a2=0 a3=7fff9b1a6d7c items=0 ppid=2299 pid=2358 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.090000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 13 08:17:35.090000 audit[2359]: NETFILTER_CFG table=filter:47 family=10 entries=2 op=nft_register_chain pid=2359 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.090000 audit[2359]: SYSCALL arch=c000003e syscall=46 success=yes exit=132 a0=3 a1=7ffdd92a81c0 a2=0 a3=7ffdd92a81ac items=0 ppid=2299 pid=2359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.090000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 08:17:35.092000 audit[2361]: NETFILTER_CFG table=filter:48 family=10 entries=1 op=nft_register_rule pid=2361 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.092000 audit[2361]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7fff13736400 a2=0 a3=7fff137363ec items=0 ppid=2299 pid=2361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.092000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 13 08:17:35.092000 audit[2362]: NETFILTER_CFG table=nat:49 family=10 entries=1 op=nft_register_chain pid=2362 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.092000 audit[2362]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffea2c63680 a2=0 a3=7ffea2c6366c items=0 ppid=2299 pid=2362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.092000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 13 08:17:35.093000 audit[2363]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2363 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.093000 audit[2363]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff64caffa0 a2=0 a3=7fff64caff8c items=0 ppid=2299 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.093000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 08:17:35.094000 audit[2365]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_rule pid=2365 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.094000 audit[2365]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd881f8980 a2=0 a3=7ffd881f896c items=0 ppid=2299 pid=2365 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.094000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 13 08:17:35.095000 audit[2367]: NETFILTER_CFG table=nat:52 family=10 entries=2 op=nft_register_chain pid=2367 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.095000 audit[2367]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7fffae63fae0 a2=0 a3=7fffae63facc items=0 ppid=2299 pid=2367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.095000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 08:17:35.096000 audit[2369]: NETFILTER_CFG table=nat:53 family=10 entries=1 op=nft_register_rule pid=2369 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.096000 audit[2369]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7ffc625fcba0 a2=0 a3=7ffc625fcb8c items=0 ppid=2299 pid=2369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.096000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 13 08:17:35.097000 audit[2371]: NETFILTER_CFG table=nat:54 family=10 entries=1 op=nft_register_rule pid=2371 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.097000 audit[2371]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7ffe1722fb60 a2=0 a3=7ffe1722fb4c items=0 ppid=2299 pid=2371 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.097000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 13 08:17:35.098000 audit[2373]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_rule pid=2373 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.098000 audit[2373]: SYSCALL arch=c000003e syscall=46 success=yes exit=556 a0=3 a1=7ffcc74c32d0 a2=0 a3=7ffcc74c32bc items=0 ppid=2299 pid=2373 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.098000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 13 08:17:35.100292 kubelet[2299]: I0213 08:17:35.100252 2299 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv6 Feb 13 08:17:35.100292 kubelet[2299]: I0213 08:17:35.100262 2299 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 13 08:17:35.100292 kubelet[2299]: I0213 08:17:35.100274 2299 kubelet.go:2113] "Starting kubelet main sync loop" Feb 13 08:17:35.100342 kubelet[2299]: E0213 08:17:35.100300 2299 kubelet.go:2137] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Feb 13 08:17:35.100572 kubelet[2299]: W0213 08:17:35.100512 2299 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://145.40.67.79:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.100572 kubelet[2299]: E0213 08:17:35.100543 2299 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://145.40.67.79:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.099000 audit[2374]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=2374 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.099000 audit[2374]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffff84f0dc0 a2=0 a3=7ffff84f0dac items=0 ppid=2299 pid=2374 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.099000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 13 08:17:35.100000 audit[2375]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=2375 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.100000 audit[2375]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda34cb180 a2=0 a3=7ffda34cb16c items=0 ppid=2299 pid=2375 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.100000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 13 08:17:35.100000 audit[2376]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=2376 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:35.100000 audit[2376]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffee16028c0 a2=0 a3=7ffee16028ac items=0 ppid=2299 pid=2376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:35.100000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 13 08:17:35.168608 kubelet[2299]: I0213 08:17:35.168546 2299 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.169380 kubelet[2299]: E0213 08:17:35.169328 2299 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.79:6443/api/v1/nodes\": dial tcp 145.40.67.79:6443: connect: connection 
refused" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.200923 kubelet[2299]: I0213 08:17:35.200818 2299 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:35.204550 kubelet[2299]: I0213 08:17:35.204475 2299 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:35.207919 kubelet[2299]: I0213 08:17:35.207840 2299 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:35.208718 kubelet[2299]: I0213 08:17:35.208638 2299 status_manager.go:698] "Failed to get status for pod" podUID=e80f78450fc8c0836f40aa3407ae0794 pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" err="Get \"https://145.40.67.79:6443/api/v1/namespaces/kube-system/pods/kube-apiserver-ci-3510.3.2-a-56b02fc11a\": dial tcp 145.40.67.79:6443: connect: connection refused" Feb 13 08:17:35.211683 kubelet[2299]: I0213 08:17:35.211636 2299 status_manager.go:698] "Failed to get status for pod" podUID=52811d8b9ad9a19a787d7f637ccab430 pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" err="Get \"https://145.40.67.79:6443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ci-3510.3.2-a-56b02fc11a\": dial tcp 145.40.67.79:6443: connect: connection refused" Feb 13 08:17:35.215095 kubelet[2299]: I0213 08:17:35.215046 2299 status_manager.go:698] "Failed to get status for pod" podUID=ab6b995260b3aba724e2153643585c97 pod="kube-system/kube-scheduler-ci-3510.3.2-a-56b02fc11a" err="Get \"https://145.40.67.79:6443/api/v1/namespaces/kube-system/pods/kube-scheduler-ci-3510.3.2-a-56b02fc11a\": dial tcp 145.40.67.79:6443: connect: connection refused" Feb 13 08:17:35.262046 kubelet[2299]: I0213 08:17:35.261928 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262046 kubelet[2299]: I0213 08:17:35.262038 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262409 kubelet[2299]: I0213 08:17:35.262111 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262409 kubelet[2299]: I0213 08:17:35.262246 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e80f78450fc8c0836f40aa3407ae0794-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" (UID: \"e80f78450fc8c0836f40aa3407ae0794\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262409 kubelet[2299]: I0213 08:17:35.262344 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262409 kubelet[2299]: I0213 08:17:35.262412 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262733 kubelet[2299]: I0213 08:17:35.262486 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ab6b995260b3aba724e2153643585c97-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-56b02fc11a\" (UID: \"ab6b995260b3aba724e2153643585c97\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262733 kubelet[2299]: I0213 08:17:35.262550 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e80f78450fc8c0836f40aa3407ae0794-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" (UID: \"e80f78450fc8c0836f40aa3407ae0794\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.262733 kubelet[2299]: I0213 08:17:35.262617 2299 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e80f78450fc8c0836f40aa3407ae0794-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" (UID: \"e80f78450fc8c0836f40aa3407ae0794\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.462589 kubelet[2299]: E0213 08:17:35.462342 2299 controller.go:146] failed to ensure lease exists, will retry in 800ms, error: Get "https://145.40.67.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-56b02fc11a?timeout=10s": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.514537 env[1563]: time="2024-02-13T08:17:35.514459673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-56b02fc11a,Uid:e80f78450fc8c0836f40aa3407ae0794,Namespace:kube-system,Attempt:0,}" Feb 13 08:17:35.519528 env[1563]: time="2024-02-13T08:17:35.519443594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-56b02fc11a,Uid:52811d8b9ad9a19a787d7f637ccab430,Namespace:kube-system,Attempt:0,}" Feb 13 08:17:35.521388 env[1563]: time="2024-02-13T08:17:35.521280339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-56b02fc11a,Uid:ab6b995260b3aba724e2153643585c97,Namespace:kube-system,Attempt:0,}" Feb 13 08:17:35.573585 kubelet[2299]: I0213 08:17:35.573533 2299 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.574259 kubelet[2299]: E0213 08:17:35.574177 2299 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.79:6443/api/v1/nodes\": dial tcp 145.40.67.79:6443: connect: connection refused" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:35.802266 kubelet[2299]: W0213 08:17:35.802129 2299 reflector.go:424] 
vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://145.40.67.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.802266 kubelet[2299]: E0213 08:17:35.802248 2299 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://145.40.67.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.904478 kubelet[2299]: W0213 08:17:35.904296 2299 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://145.40.67.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-56b02fc11a&limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:35.904478 kubelet[2299]: E0213 08:17:35.904452 2299 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://145.40.67.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-56b02fc11a&limit=500&resourceVersion=0": dial tcp 145.40.67.79:6443: connect: connection refused Feb 13 08:17:36.052477 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount871694506.mount: Deactivated successfully. Feb 13 08:17:36.053273 env[1563]: time="2024-02-13T08:17:36.053233072Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.054418 env[1563]: time="2024-02-13T08:17:36.054403584Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.055040 env[1563]: time="2024-02-13T08:17:36.055000812Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.055953 env[1563]: time="2024-02-13T08:17:36.055941369Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.056346 env[1563]: time="2024-02-13T08:17:36.056334847Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.057704 env[1563]: time="2024-02-13T08:17:36.057688701Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.059554 env[1563]: time="2024-02-13T08:17:36.059507860Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.059976 env[1563]: time="2024-02-13T08:17:36.059942359Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.061506 env[1563]: time="2024-02-13T08:17:36.061470455Z" level=info msg="ImageUpdate event 
&ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.062273 env[1563]: time="2024-02-13T08:17:36.062236186Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.063435 env[1563]: time="2024-02-13T08:17:36.063399964Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.063859 env[1563]: time="2024-02-13T08:17:36.063808074Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:36.069100 env[1563]: time="2024-02-13T08:17:36.068997972Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:17:36.069100 env[1563]: time="2024-02-13T08:17:36.069060155Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:17:36.069100 env[1563]: time="2024-02-13T08:17:36.069071946Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:17:36.069310 env[1563]: time="2024-02-13T08:17:36.069251877Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/9dbfdb25a9fa9a7cee98335f58591b8c9ef5b3af6003c5faa979c0c00f0d8ee2 pid=2386 runtime=io.containerd.runc.v2 Feb 13 08:17:36.071928 env[1563]: time="2024-02-13T08:17:36.071895189Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:17:36.071928 env[1563]: time="2024-02-13T08:17:36.071915741Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:17:36.071928 env[1563]: time="2024-02-13T08:17:36.071922736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:17:36.072061 env[1563]: time="2024-02-13T08:17:36.071986026Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/4b36b0cd595185d3237d7b67249f9f649ed5152b369889fd9ac7f76c1d62c02b pid=2409 runtime=io.containerd.runc.v2 Feb 13 08:17:36.072061 env[1563]: time="2024-02-13T08:17:36.072046247Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:17:36.072128 env[1563]: time="2024-02-13T08:17:36.072061783Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:17:36.072128 env[1563]: time="2024-02-13T08:17:36.072068135Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:17:36.072180 env[1563]: time="2024-02-13T08:17:36.072136276Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/9cfb2d93b0e43ca488a333040c0a2858005692ca3e7b2485bf743cd45e7df9ef pid=2417 runtime=io.containerd.runc.v2 Feb 13 08:17:36.074743 kubelet[2299]: E0213 08:17:36.074684 2299 event.go:276] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-56b02fc11a.17b35e3289f9d726", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-56b02fc11a", UID:"ci-3510.3.2-a-56b02fc11a", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-56b02fc11a"}, FirstTimestamp:time.Date(2024, time.February, 13, 8, 17, 34, 859777830, time.Local), LastTimestamp:time.Date(2024, time.February, 13, 8, 17, 34, 859777830, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'Post "https://145.40.67.79:6443/api/v1/namespaces/default/events": dial tcp 145.40.67.79:6443: connect: connection refused'(may retry after sleeping) Feb 13 08:17:36.097643 env[1563]: time="2024-02-13T08:17:36.097620526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-56b02fc11a,Uid:e80f78450fc8c0836f40aa3407ae0794,Namespace:kube-system,Attempt:0,} returns sandbox id \"9dbfdb25a9fa9a7cee98335f58591b8c9ef5b3af6003c5faa979c0c00f0d8ee2\"" Feb 13 08:17:36.099161 env[1563]: time="2024-02-13T08:17:36.099145580Z" level=info msg="CreateContainer within sandbox \"9dbfdb25a9fa9a7cee98335f58591b8c9ef5b3af6003c5faa979c0c00f0d8ee2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 13 08:17:36.100492 env[1563]: time="2024-02-13T08:17:36.100474369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-56b02fc11a,Uid:52811d8b9ad9a19a787d7f637ccab430,Namespace:kube-system,Attempt:0,} returns sandbox id \"9cfb2d93b0e43ca488a333040c0a2858005692ca3e7b2485bf743cd45e7df9ef\"" Feb 13 08:17:36.100666 env[1563]: time="2024-02-13T08:17:36.100653207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-56b02fc11a,Uid:ab6b995260b3aba724e2153643585c97,Namespace:kube-system,Attempt:0,} returns sandbox id \"4b36b0cd595185d3237d7b67249f9f649ed5152b369889fd9ac7f76c1d62c02b\"" Feb 13 08:17:36.101572 env[1563]: time="2024-02-13T08:17:36.101552662Z" level=info msg="CreateContainer within sandbox \"4b36b0cd595185d3237d7b67249f9f649ed5152b369889fd9ac7f76c1d62c02b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 13 08:17:36.101651 env[1563]: time="2024-02-13T08:17:36.101638017Z" level=info msg="CreateContainer within sandbox \"9cfb2d93b0e43ca488a333040c0a2858005692ca3e7b2485bf743cd45e7df9ef\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 13 08:17:36.105425 env[1563]: time="2024-02-13T08:17:36.105383622Z" level=info msg="CreateContainer within sandbox \"9dbfdb25a9fa9a7cee98335f58591b8c9ef5b3af6003c5faa979c0c00f0d8ee2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fdce3fe01f82d0c47ace28ab099ba31b4d0387eb49993912e62c431f75d7abb0\"" Feb 13 08:17:36.105653 env[1563]: time="2024-02-13T08:17:36.105638249Z" level=info msg="StartContainer for \"fdce3fe01f82d0c47ace28ab099ba31b4d0387eb49993912e62c431f75d7abb0\"" Feb 13 08:17:36.106843 env[1563]: time="2024-02-13T08:17:36.106811100Z" level=info msg="CreateContainer within sandbox \"4b36b0cd595185d3237d7b67249f9f649ed5152b369889fd9ac7f76c1d62c02b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5917ee417b6701b8246f6e7a8fe02cc214b77f65c7da0b4df6b28e59af7c802c\"" Feb 13 08:17:36.106967 env[1563]: time="2024-02-13T08:17:36.106951702Z" level=info msg="StartContainer for \"5917ee417b6701b8246f6e7a8fe02cc214b77f65c7da0b4df6b28e59af7c802c\"" Feb 13 08:17:36.107724 env[1563]: time="2024-02-13T08:17:36.107698751Z" level=info msg="CreateContainer within sandbox \"9cfb2d93b0e43ca488a333040c0a2858005692ca3e7b2485bf743cd45e7df9ef\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"37a689dd23b184fdc5714c3651de93906f49378c4d0a6257cb3754cd6253a98a\"" Feb 13 08:17:36.108108 env[1563]: time="2024-02-13T08:17:36.108095311Z" level=info msg="StartContainer for \"37a689dd23b184fdc5714c3651de93906f49378c4d0a6257cb3754cd6253a98a\"" Feb 13 08:17:36.139069 env[1563]: time="2024-02-13T08:17:36.139034810Z" level=info msg="StartContainer for \"fdce3fe01f82d0c47ace28ab099ba31b4d0387eb49993912e62c431f75d7abb0\" returns successfully" Feb 13 08:17:36.139190 env[1563]: time="2024-02-13T08:17:36.139157556Z" level=info msg="StartContainer for \"5917ee417b6701b8246f6e7a8fe02cc214b77f65c7da0b4df6b28e59af7c802c\" returns successfully" Feb 13 08:17:36.139910 env[1563]: time="2024-02-13T08:17:36.139896799Z" level=info msg="StartContainer for \"37a689dd23b184fdc5714c3651de93906f49378c4d0a6257cb3754cd6253a98a\" returns successfully" Feb 13 08:17:36.375597 kubelet[2299]: I0213 08:17:36.375504 2299 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:36.983798 kubelet[2299]: E0213 08:17:36.983761 2299 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3510.3.2-a-56b02fc11a\" not found" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:37.084414 kubelet[2299]: I0213 08:17:37.084356 2299 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:37.104519 kubelet[2299]: E0213 08:17:37.104477 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.205037 kubelet[2299]: E0213 08:17:37.204935 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.306101 kubelet[2299]: E0213 08:17:37.305984 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.406915 kubelet[2299]: E0213 08:17:37.406816 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.508103 kubelet[2299]: E0213 08:17:37.507982 
2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.608792 kubelet[2299]: E0213 08:17:37.608586 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.709378 kubelet[2299]: E0213 08:17:37.709265 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.809484 kubelet[2299]: E0213 08:17:37.809387 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:37.910705 kubelet[2299]: E0213 08:17:37.910536 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:38.011649 kubelet[2299]: E0213 08:17:38.011549 2299 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-56b02fc11a\" not found" Feb 13 08:17:38.862384 kubelet[2299]: I0213 08:17:38.862301 2299 apiserver.go:52] "Watching apiserver" Feb 13 08:17:39.362173 kubelet[2299]: I0213 08:17:39.362065 2299 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 08:17:39.393746 kubelet[2299]: I0213 08:17:39.393623 2299 reconciler.go:41] "Reconciler: start to sync state" Feb 13 08:17:39.469013 kubelet[2299]: E0213 08:17:39.468935 2299 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:39.668489 kubelet[2299]: E0213 08:17:39.668316 2299 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.278909 systemd[1]: Reloading. Feb 13 08:17:40.307069 /usr/lib/systemd/system-generators/torcx-generator[2672]: time="2024-02-13T08:17:40Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:17:40.307092 /usr/lib/systemd/system-generators/torcx-generator[2672]: time="2024-02-13T08:17:40Z" level=info msg="torcx already run" Feb 13 08:17:40.382294 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:17:40.382306 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:17:40.397122 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 08:17:40.456627 systemd[1]: Stopping kubelet.service... Feb 13 08:17:40.470316 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 08:17:40.470471 systemd[1]: Stopped kubelet.service. Feb 13 08:17:40.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:17:40.471394 systemd[1]: Started kubelet.service. Feb 13 08:17:40.493911 kubelet[2738]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:17:40.493911 kubelet[2738]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:17:40.494123 kubelet[2738]: I0213 08:17:40.493907 2738 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 08:17:40.494663 kubelet[2738]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:17:40.494663 kubelet[2738]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:17:40.496371 kubelet[2738]: I0213 08:17:40.496334 2738 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 13 08:17:40.496371 kubelet[2738]: I0213 08:17:40.496344 2738 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 08:17:40.496496 kubelet[2738]: I0213 08:17:40.496462 2738 server.go:836] "Client rotation is on, will bootstrap in background" Feb 13 08:17:40.497210 kernel: kauditd_printk_skb: 108 callbacks suppressed Feb 13 08:17:40.497246 kernel: audit: type=1131 audit(1707812260.469:219): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:40.497314 kubelet[2738]: I0213 08:17:40.497278 2738 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 08:17:40.497781 kubelet[2738]: I0213 08:17:40.497762 2738 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 08:17:40.470000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:40.625629 kernel: audit: type=1130 audit(1707812260.470:220): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:40.630594 kubelet[2738]: I0213 08:17:40.630578 2738 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 08:17:40.630830 kubelet[2738]: I0213 08:17:40.630800 2738 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 08:17:40.630856 kubelet[2738]: I0213 08:17:40.630842 2738 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:cgroupfs KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 08:17:40.630856 kubelet[2738]: I0213 08:17:40.630853 2738 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 08:17:40.630934 kubelet[2738]: I0213 08:17:40.630860 2738 container_manager_linux.go:308] "Creating device plugin manager" Feb 13 08:17:40.630934 kubelet[2738]: I0213 08:17:40.630879 2738 state_mem.go:36] "Initialized new in-memory state store" Feb 13 08:17:40.632423 kubelet[2738]: I0213 08:17:40.632412 2738 kubelet.go:398] "Attempting to sync node with API server" Feb 13 08:17:40.632471 kubelet[2738]: I0213 08:17:40.632428 2738 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 08:17:40.632471 kubelet[2738]: I0213 08:17:40.632446 2738 kubelet.go:297] "Adding apiserver pod source" Feb 13 08:17:40.632471 kubelet[2738]: I0213 08:17:40.632459 2738 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 08:17:40.632755 kubelet[2738]: I0213 08:17:40.632745 2738 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 08:17:40.633050 kubelet[2738]: I0213 08:17:40.633040 2738 server.go:1186] "Started kubelet" Feb 13 08:17:40.633089 kubelet[2738]: I0213 08:17:40.633079 2738 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 08:17:40.633720 kubelet[2738]: I0213 08:17:40.633712 2738 server.go:451] "Adding debug handlers to kubelet server" Feb 13 08:17:40.632000 audit[2738]: AVC avc: denied { mac_admin } for pid=2738 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:40.633970 kubelet[2738]: I0213 08:17:40.633954 2738 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr 
/var/lib/kubelet/plugins_registry: invalid argument" Feb 13 08:17:40.634004 kubelet[2738]: I0213 08:17:40.633972 2738 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 08:17:40.634004 kubelet[2738]: I0213 08:17:40.633985 2738 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 08:17:40.634182 kubelet[2738]: I0213 08:17:40.634170 2738 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 13 08:17:40.634223 kubelet[2738]: I0213 08:17:40.634210 2738 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Feb 13 08:17:40.636617 kubelet[2738]: E0213 08:17:40.636599 2738 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 08:17:40.636727 kubelet[2738]: E0213 08:17:40.636716 2738 kubelet.go:1386] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 08:17:40.648852 kubelet[2738]: I0213 08:17:40.648836 2738 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 13 08:17:40.655086 kubelet[2738]: I0213 08:17:40.655073 2738 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv6 Feb 13 08:17:40.655086 kubelet[2738]: I0213 08:17:40.655085 2738 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 13 08:17:40.655181 kubelet[2738]: I0213 08:17:40.655101 2738 kubelet.go:2113] "Starting kubelet main sync loop" Feb 13 08:17:40.655181 kubelet[2738]: E0213 08:17:40.655136 2738 kubelet.go:2137] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 08:17:40.673543 kubelet[2738]: I0213 08:17:40.673486 2738 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 08:17:40.673621 kubelet[2738]: I0213 08:17:40.673563 2738 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 08:17:40.673658 kubelet[2738]: I0213 08:17:40.673629 2738 state_mem.go:36] "Initialized new in-memory state store" Feb 13 08:17:40.673772 kubelet[2738]: I0213 08:17:40.673742 2738 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 08:17:40.673772 kubelet[2738]: I0213 08:17:40.673754 2738 state_mem.go:96] "Updated CPUSet assignments" assignments=map[] Feb 13 08:17:40.673772 kubelet[2738]: I0213 08:17:40.673760 2738 policy_none.go:49] "None policy: Start" Feb 13 08:17:40.674070 kubelet[2738]: I0213 08:17:40.674030 2738 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 08:17:40.674070 kubelet[2738]: I0213 08:17:40.674043 2738 state_mem.go:35] "Initializing new in-memory state store" Feb 13 08:17:40.674131 kubelet[2738]: I0213 08:17:40.674114 2738 state_mem.go:75] "Updated machine memory state" Feb 13 08:17:40.674725 kubelet[2738]: I0213 08:17:40.674719 2738 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 08:17:40.674753 kubelet[2738]: I0213 08:17:40.674748 2738 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 08:17:40.674861 kubelet[2738]: I0213 08:17:40.674856 2738 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 08:17:40.632000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:40.730556 kernel: audit: type=1400 audit(1707812260.632:221): avc: denied { mac_admin } for pid=2738 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:40.730624 kernel: audit: type=1401 audit(1707812260.632:221): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:40.730637 kernel: audit: type=1300 audit(1707812260.632:221): arch=c000003e syscall=188 success=no exit=-22 a0=c0002d8e40 a1=c000326c00 a2=c0002d8e10 a3=25 items=0 ppid=1 pid=2738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:40.632000 audit[2738]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0002d8e40 a1=c000326c00 a2=c0002d8e10 a3=25 items=0 ppid=1 pid=2738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:40.736430 kubelet[2738]: I0213 08:17:40.736391 2738 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.741588 kubelet[2738]: I0213 08:17:40.741574 2738 kubelet_node_status.go:108] "Node was previously registered" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.741647 kubelet[2738]: I0213 08:17:40.741628 2738 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.755198 kubelet[2738]: I0213 08:17:40.755183 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:40.755282 kubelet[2738]: I0213 08:17:40.755222 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:40.755282 kubelet[2738]: I0213 08:17:40.755240 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:40.758766 kubelet[2738]: E0213 08:17:40.758754 2738 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.632000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:40.836357 kubelet[2738]: E0213 08:17:40.836345 2738 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.917722 kernel: audit: type=1327 audit(1707812260.632:221): proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:40.917818 kernel: audit: type=1400 audit(1707812260.632:222): avc: denied 
{ mac_admin } for pid=2738 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:40.632000 audit[2738]: AVC avc: denied { mac_admin } for pid=2738 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:40.936088 kubelet[2738]: I0213 08:17:40.936067 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e80f78450fc8c0836f40aa3407ae0794-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" (UID: \"e80f78450fc8c0836f40aa3407ae0794\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936088 kubelet[2738]: I0213 08:17:40.936095 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936232 kubelet[2738]: I0213 08:17:40.936109 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936232 kubelet[2738]: I0213 08:17:40.936122 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936232 kubelet[2738]: I0213 08:17:40.936158 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e80f78450fc8c0836f40aa3407ae0794-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" (UID: \"e80f78450fc8c0836f40aa3407ae0794\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936232 kubelet[2738]: I0213 08:17:40.936176 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e80f78450fc8c0836f40aa3407ae0794-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" (UID: \"e80f78450fc8c0836f40aa3407ae0794\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936232 kubelet[2738]: I0213 08:17:40.936199 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936378 kubelet[2738]: I0213 08:17:40.936226 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/52811d8b9ad9a19a787d7f637ccab430-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" (UID: \"52811d8b9ad9a19a787d7f637ccab430\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.936378 kubelet[2738]: I0213 08:17:40.936239 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ab6b995260b3aba724e2153643585c97-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-56b02fc11a\" (UID: \"ab6b995260b3aba724e2153643585c97\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:40.981143 kernel: audit: type=1401 audit(1707812260.632:222): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:40.632000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:41.013465 kernel: audit: type=1300 audit(1707812260.632:222): arch=c000003e syscall=188 success=no exit=-22 a0=c000527a20 a1=c000326c18 a2=c0002d8ed0 a3=25 items=0 ppid=1 pid=2738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:40.632000 audit[2738]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000527a20 a1=c000326c18 a2=c0002d8ed0 a3=25 items=0 ppid=1 pid=2738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:41.036333 kubelet[2738]: E0213 08:17:41.036283 2738 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-scheduler-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:41.107920 kernel: audit: type=1327 audit(1707812260.632:222): proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:40.632000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:17:40.673000 audit[2738]: AVC avc: denied { mac_admin } for pid=2738 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:17:40.673000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:17:40.673000 audit[2738]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c001941200 a1=c0014b9128 a2=c0019411d0 a3=25 items=0 ppid=1 pid=2738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:40.673000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 
08:17:41.633242 kubelet[2738]: I0213 08:17:41.633162 2738 apiserver.go:52] "Watching apiserver" Feb 13 08:17:41.735384 kubelet[2738]: I0213 08:17:41.735293 2738 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 08:17:41.740403 kubelet[2738]: I0213 08:17:41.740329 2738 reconciler.go:41] "Reconciler: start to sync state" Feb 13 08:17:42.038278 kubelet[2738]: E0213 08:17:42.038188 2738 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:42.241420 kubelet[2738]: E0213 08:17:42.241322 2738 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-3510.3.2-a-56b02fc11a\" already exists" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" Feb 13 08:17:42.465967 kubelet[2738]: I0213 08:17:42.465777 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510.3.2-a-56b02fc11a" podStartSLOduration=4.465682365 pod.CreationTimestamp="2024-02-13 08:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:17:42.465579708 +0000 UTC m=+1.992488833" watchObservedRunningTime="2024-02-13 08:17:42.465682365 +0000 UTC m=+1.992591479" Feb 13 08:17:42.836981 kubelet[2738]: I0213 08:17:42.836956 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510.3.2-a-56b02fc11a" podStartSLOduration=4.836922326 pod.CreationTimestamp="2024-02-13 08:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:17:42.83673318 +0000 UTC m=+2.363642247" watchObservedRunningTime="2024-02-13 08:17:42.836922326 +0000 UTC m=+2.363831390" Feb 13 08:17:43.236050 kubelet[2738]: I0213 08:17:43.235989 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-56b02fc11a" podStartSLOduration=5.23596521 pod.CreationTimestamp="2024-02-13 08:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:17:43.235811872 +0000 UTC m=+2.762720945" watchObservedRunningTime="2024-02-13 08:17:43.23596521 +0000 UTC m=+2.762874275" Feb 13 08:17:45.866758 sudo[1755]: pam_unix(sudo:session): session closed for user root Feb 13 08:17:45.865000 audit[1755]: USER_END pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:45.867643 sshd[1751]: pam_unix(sshd:session): session closed for user core Feb 13 08:17:45.869210 systemd[1]: sshd@6-145.40.67.79:22-139.178.68.195:46210.service: Deactivated successfully. Feb 13 08:17:45.869916 systemd-logind[1548]: Session 9 logged out. Waiting for processes to exit. Feb 13 08:17:45.869948 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 08:17:45.870667 systemd-logind[1548]: Removed session 9. 
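The PROCTITLE fields in the audit records above are the process argv stored as NUL-separated bytes and printed in hex, so they can be decoded offline; a minimal sketch, assuming Python 3 on whatever machine the log is being read on, using only a prefix of the kubelet record shown earlier (the kernel truncates the full record, and the truncated tail is left as-is here):

    # Sketch: decode an audit PROCTITLE hex field (NUL-separated argv, hex-encoded).
    def decode_proctitle(hex_field: str) -> str:
        raw = bytes.fromhex(hex_field)
        return " ".join(a.decode("utf-8", "replace") for a in raw.split(b"\x00") if a)

    # Prefix of the kubelet PROCTITLE seen in the records above (illustrative only):
    sample = ("2F6F70742F62696E2F6B7562656C6574"
              "002D2D626F6F7473747261702D6B756265636F6E666967")
    print(decode_proctitle(sample))  # -> /opt/bin/kubelet --bootstrap-kubeconfig

The same decoding applies to the iptables/ip6tables PROCTITLE records later in the log; for example, the first NETFILTER_CFG record's proctitle decodes to "iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle", i.e. the chain-creation commands kube-proxy issues.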
Feb 13 08:17:45.893063 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:17:45.893098 kernel: audit: type=1106 audit(1707812265.865:224): pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:45.865000 audit[1755]: CRED_DISP pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:46.067116 kernel: audit: type=1104 audit(1707812265.865:225): pid=1755 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:17:46.067148 kernel: audit: type=1106 audit(1707812265.867:226): pid=1751 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:45.867000 audit[1751]: USER_END pid=1751 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:46.161430 kernel: audit: type=1104 audit(1707812265.867:227): pid=1751 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:45.867000 audit[1751]: CRED_DISP pid=1751 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:17:46.249343 kernel: audit: type=1131 audit(1707812265.868:228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.67.79:22-139.178.68.195:46210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:45.868000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.67.79:22-139.178.68.195:46210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:17:52.165190 update_engine[1550]: I0213 08:17:52.165118 1550 update_attempter.cc:509] Updating boot flags... Feb 13 08:17:53.196495 kubelet[2738]: I0213 08:17:53.196478 2738 kuberuntime_manager.go:1114] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 08:17:53.196744 env[1563]: time="2024-02-13T08:17:53.196660132Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Feb 13 08:17:53.196876 kubelet[2738]: I0213 08:17:53.196754 2738 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 08:17:53.908270 kubelet[2738]: I0213 08:17:53.908207 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:54.014497 kubelet[2738]: I0213 08:17:54.014383 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4q6\" (UniqueName: \"kubernetes.io/projected/62aea10c-fd29-462d-8c87-6aed3868b8b7-kube-api-access-lk4q6\") pod \"kube-proxy-gq6jf\" (UID: \"62aea10c-fd29-462d-8c87-6aed3868b8b7\") " pod="kube-system/kube-proxy-gq6jf" Feb 13 08:17:54.014497 kubelet[2738]: I0213 08:17:54.014502 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/62aea10c-fd29-462d-8c87-6aed3868b8b7-xtables-lock\") pod \"kube-proxy-gq6jf\" (UID: \"62aea10c-fd29-462d-8c87-6aed3868b8b7\") " pod="kube-system/kube-proxy-gq6jf" Feb 13 08:17:54.014907 kubelet[2738]: I0213 08:17:54.014575 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/62aea10c-fd29-462d-8c87-6aed3868b8b7-kube-proxy\") pod \"kube-proxy-gq6jf\" (UID: \"62aea10c-fd29-462d-8c87-6aed3868b8b7\") " pod="kube-system/kube-proxy-gq6jf" Feb 13 08:17:54.014907 kubelet[2738]: I0213 08:17:54.014644 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/62aea10c-fd29-462d-8c87-6aed3868b8b7-lib-modules\") pod \"kube-proxy-gq6jf\" (UID: \"62aea10c-fd29-462d-8c87-6aed3868b8b7\") " pod="kube-system/kube-proxy-gq6jf" Feb 13 08:17:54.215182 env[1563]: time="2024-02-13T08:17:54.214933183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gq6jf,Uid:62aea10c-fd29-462d-8c87-6aed3868b8b7,Namespace:kube-system,Attempt:0,}" Feb 13 08:17:54.239244 env[1563]: time="2024-02-13T08:17:54.238971255Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:17:54.239486 env[1563]: time="2024-02-13T08:17:54.239182671Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:17:54.239486 env[1563]: time="2024-02-13T08:17:54.239243095Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:17:54.239849 env[1563]: time="2024-02-13T08:17:54.239700942Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/7e566dcda40e9b7e6e07d36d5d86e29486661cd529f4be9be0d77cd9f5577399 pid=2948 runtime=io.containerd.runc.v2 Feb 13 08:17:54.307592 env[1563]: time="2024-02-13T08:17:54.307519847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gq6jf,Uid:62aea10c-fd29-462d-8c87-6aed3868b8b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"7e566dcda40e9b7e6e07d36d5d86e29486661cd529f4be9be0d77cd9f5577399\"" Feb 13 08:17:54.311450 env[1563]: time="2024-02-13T08:17:54.311386818Z" level=info msg="CreateContainer within sandbox \"7e566dcda40e9b7e6e07d36d5d86e29486661cd529f4be9be0d77cd9f5577399\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 08:17:54.321633 kubelet[2738]: I0213 08:17:54.321595 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:54.322242 env[1563]: time="2024-02-13T08:17:54.322132049Z" level=info msg="CreateContainer within sandbox \"7e566dcda40e9b7e6e07d36d5d86e29486661cd529f4be9be0d77cd9f5577399\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7917cf214d366281e2b0883c3f8ed8d10077874b7a95c47aef259fa34e4b1ad2\"" Feb 13 08:17:54.322894 env[1563]: time="2024-02-13T08:17:54.322841655Z" level=info msg="StartContainer for \"7917cf214d366281e2b0883c3f8ed8d10077874b7a95c47aef259fa34e4b1ad2\"" Feb 13 08:17:54.375177 env[1563]: time="2024-02-13T08:17:54.375117216Z" level=info msg="StartContainer for \"7917cf214d366281e2b0883c3f8ed8d10077874b7a95c47aef259fa34e4b1ad2\" returns successfully" Feb 13 08:17:54.414000 audit[3047]: NETFILTER_CFG table=mangle:59 family=2 entries=1 op=nft_register_chain pid=3047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.416922 kubelet[2738]: I0213 08:17:54.416752 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfsrk\" (UniqueName: \"kubernetes.io/projected/61acb94e-d8b9-4841-8581-3eadbca84f94-kube-api-access-jfsrk\") pod \"tigera-operator-cfc98749c-qckdw\" (UID: \"61acb94e-d8b9-4841-8581-3eadbca84f94\") " pod="tigera-operator/tigera-operator-cfc98749c-qckdw" Feb 13 08:17:54.416922 kubelet[2738]: I0213 08:17:54.416829 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/61acb94e-d8b9-4841-8581-3eadbca84f94-var-lib-calico\") pod \"tigera-operator-cfc98749c-qckdw\" (UID: \"61acb94e-d8b9-4841-8581-3eadbca84f94\") " pod="tigera-operator/tigera-operator-cfc98749c-qckdw" Feb 13 08:17:54.414000 audit[3047]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc63120a00 a2=0 a3=7ffc631209ec items=0 ppid=3000 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.568233 kernel: audit: type=1325 audit(1707812274.414:229): table=mangle:59 family=2 entries=1 op=nft_register_chain pid=3047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.568293 kernel: audit: type=1300 audit(1707812274.414:229): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc63120a00 a2=0 a3=7ffc631209ec items=0 ppid=3000 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.568309 kernel: audit: type=1327 audit(1707812274.414:229): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 08:17:54.414000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 08:17:54.415000 audit[3048]: NETFILTER_CFG table=mangle:60 family=10 entries=1 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:54.626055 env[1563]: time="2024-02-13T08:17:54.625985867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-qckdw,Uid:61acb94e-d8b9-4841-8581-3eadbca84f94,Namespace:tigera-operator,Attempt:0,}" Feb 13 08:17:54.631897 env[1563]: time="2024-02-13T08:17:54.631842974Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:17:54.631897 env[1563]: time="2024-02-13T08:17:54.631863195Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:17:54.631897 env[1563]: time="2024-02-13T08:17:54.631870119Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:17:54.631987 env[1563]: time="2024-02-13T08:17:54.631930572Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5faea167b4860c5427ec67f287a3748586a796844f9778eae4b19c1cb9a501cc pid=3082 runtime=io.containerd.runc.v2 Feb 13 08:17:54.683885 kernel: audit: type=1325 audit(1707812274.415:230): table=mangle:60 family=10 entries=1 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:54.683940 kernel: audit: type=1300 audit(1707812274.415:230): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffed5b54310 a2=0 a3=7ffed5b542fc items=0 ppid=3000 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.415000 audit[3048]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffed5b54310 a2=0 a3=7ffed5b542fc items=0 ppid=3000 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.692024 kubelet[2738]: I0213 08:17:54.691998 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-gq6jf" podStartSLOduration=1.69196378 pod.CreationTimestamp="2024-02-13 08:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:17:54.691661861 +0000 UTC m=+14.218570928" watchObservedRunningTime="2024-02-13 08:17:54.69196378 +0000 UTC m=+14.218872843" Feb 13 08:17:54.415000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 08:17:54.838273 kernel: audit: type=1327 audit(1707812274.415:230): 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 08:17:54.838307 kernel: audit: type=1325 audit(1707812274.415:231): table=nat:61 family=2 entries=1 op=nft_register_chain pid=3049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.415000 audit[3049]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=3049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.895889 kernel: audit: type=1300 audit(1707812274.415:231): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4cc1e910 a2=0 a3=7fff4cc1e8fc items=0 ppid=3000 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.415000 audit[3049]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4cc1e910 a2=0 a3=7fff4cc1e8fc items=0 ppid=3000 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.992413 kernel: audit: type=1327 audit(1707812274.415:231): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 08:17:54.415000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 08:17:55.050280 kernel: audit: type=1325 audit(1707812274.416:232): table=nat:62 family=10 entries=1 op=nft_register_chain pid=3050 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:54.416000 audit[3050]: NETFILTER_CFG table=nat:62 family=10 entries=1 op=nft_register_chain pid=3050 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:54.416000 audit[3050]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd6e08e930 a2=0 a3=7ffd6e08e91c items=0 ppid=3000 pid=3050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.416000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 08:17:54.417000 audit[3053]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.417000 audit[3053]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffeca0a2b50 a2=0 a3=7ffeca0a2b3c items=0 ppid=3000 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.417000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 08:17:54.417000 audit[3054]: NETFILTER_CFG table=filter:64 family=10 entries=1 op=nft_register_chain pid=3054 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:54.417000 audit[3054]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc41cea950 a2=0 a3=7ffc41cea93c items=0 ppid=3000 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.417000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 08:17:54.522000 audit[3056]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.522000 audit[3056]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fff4b6aa2b0 a2=0 a3=7fff4b6aa29c items=0 ppid=3000 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.522000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 08:17:54.523000 audit[3058]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3058 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.523000 audit[3058]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc7664bac0 a2=0 a3=7ffc7664baac items=0 ppid=3000 pid=3058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.523000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Feb 13 08:17:54.525000 audit[3061]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3061 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.525000 audit[3061]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe0c084a50 a2=0 a3=7ffe0c084a3c items=0 ppid=3000 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.525000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Feb 13 08:17:54.526000 audit[3062]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3062 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.526000 audit[3062]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd28c13e0 a2=0 a3=7ffcd28c13cc items=0 ppid=3000 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.526000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 08:17:54.527000 audit[3064]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.527000 audit[3064]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe31e70af0 a2=0 a3=7ffe31e70adc 
items=0 ppid=3000 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.527000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 08:17:54.528000 audit[3065]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.528000 audit[3065]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe89de1160 a2=0 a3=7ffe89de114c items=0 ppid=3000 pid=3065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.528000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 08:17:54.529000 audit[3067]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3067 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.529000 audit[3067]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd83d8ebe0 a2=0 a3=7ffd83d8ebcc items=0 ppid=3000 pid=3067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.529000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 08:17:54.531000 audit[3070]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3070 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.531000 audit[3070]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fffec78b6c0 a2=0 a3=7fffec78b6ac items=0 ppid=3000 pid=3070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.531000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Feb 13 08:17:54.531000 audit[3071]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_chain pid=3071 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.531000 audit[3071]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffb5f13320 a2=0 a3=7fffb5f1330c items=0 ppid=3000 pid=3071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.531000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 08:17:54.533000 audit[3073]: NETFILTER_CFG 
table=filter:74 family=2 entries=1 op=nft_register_rule pid=3073 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.533000 audit[3073]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffed7322b90 a2=0 a3=7ffed7322b7c items=0 ppid=3000 pid=3073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.533000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 08:17:54.533000 audit[3074]: NETFILTER_CFG table=filter:75 family=2 entries=1 op=nft_register_chain pid=3074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:54.533000 audit[3074]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff98cd7b00 a2=0 a3=7fff98cd7aec items=0 ppid=3000 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:54.533000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 08:17:55.109000 audit[3111]: NETFILTER_CFG table=filter:76 family=2 entries=1 op=nft_register_rule pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:55.109000 audit[3111]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff1bc76c80 a2=0 a3=7fff1bc76c6c items=0 ppid=3000 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.109000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 08:17:55.111000 audit[3117]: NETFILTER_CFG table=filter:77 family=2 entries=1 op=nft_register_rule pid=3117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:55.111000 audit[3117]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffe3fd06e0 a2=0 a3=7fffe3fd06cc items=0 ppid=3000 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.111000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 08:17:55.113000 audit[3126]: NETFILTER_CFG table=filter:78 family=2 entries=1 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:55.113000 audit[3126]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc258e62b0 a2=0 a3=7ffc258e629c items=0 ppid=3000 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.113000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 08:17:55.113000 audit[3127]: NETFILTER_CFG table=nat:79 family=2 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:55.113000 audit[3127]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff04f653d0 a2=0 a3=7fff04f653bc items=0 ppid=3000 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.113000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 08:17:55.114721 env[1563]: time="2024-02-13T08:17:55.114684307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-qckdw,Uid:61acb94e-d8b9-4841-8581-3eadbca84f94,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5faea167b4860c5427ec67f287a3748586a796844f9778eae4b19c1cb9a501cc\"" Feb 13 08:17:55.115493 env[1563]: time="2024-02-13T08:17:55.115478938Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\"" Feb 13 08:17:55.114000 audit[3129]: NETFILTER_CFG table=nat:80 family=2 entries=1 op=nft_register_rule pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:55.114000 audit[3129]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffe35cf0f20 a2=0 a3=7ffe35cf0f0c items=0 ppid=3000 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.114000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:17:55.116000 audit[3132]: NETFILTER_CFG table=nat:81 family=2 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:17:55.116000 audit[3132]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff623763d0 a2=0 a3=7fff623763bc items=0 ppid=3000 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.116000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:17:55.122000 audit[3136]: NETFILTER_CFG table=filter:82 family=2 entries=6 op=nft_register_rule pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:17:55.122000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffc4b2c0790 a2=0 a3=7ffc4b2c077c items=0 ppid=3000 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.122000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:55.125000 audit[3136]: NETFILTER_CFG table=nat:83 family=2 entries=17 op=nft_register_chain pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:17:55.125000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffc4b2c0790 a2=0 a3=7ffc4b2c077c items=0 ppid=3000 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.125000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:55.136000 audit[3139]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.136000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe09d4fd30 a2=0 a3=7ffe09d4fd1c items=0 ppid=3000 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.136000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 08:17:55.138620 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4224043437.mount: Deactivated successfully. Feb 13 08:17:55.137000 audit[3141]: NETFILTER_CFG table=filter:85 family=10 entries=2 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.137000 audit[3141]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffd31aa6b20 a2=0 a3=7ffd31aa6b0c items=0 ppid=3000 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.137000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Feb 13 08:17:55.139000 audit[3144]: NETFILTER_CFG table=filter:86 family=10 entries=2 op=nft_register_chain pid=3144 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.139000 audit[3144]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffcfd32e0d0 a2=0 a3=7ffcfd32e0bc items=0 ppid=3000 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.139000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Feb 13 08:17:55.140000 audit[3145]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 
08:17:55.140000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffde32604d0 a2=0 a3=7ffde32604bc items=0 ppid=3000 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.140000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 08:17:55.141000 audit[3147]: NETFILTER_CFG table=filter:88 family=10 entries=1 op=nft_register_rule pid=3147 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.141000 audit[3147]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcf3914af0 a2=0 a3=7ffcf3914adc items=0 ppid=3000 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.141000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 08:17:55.141000 audit[3148]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3148 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.141000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc1e44ecc0 a2=0 a3=7ffc1e44ecac items=0 ppid=3000 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.141000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 08:17:55.143000 audit[3150]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.143000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdd015ee50 a2=0 a3=7ffdd015ee3c items=0 ppid=3000 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.143000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Feb 13 08:17:55.144000 audit[3153]: NETFILTER_CFG table=filter:91 family=10 entries=2 op=nft_register_chain pid=3153 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.144000 audit[3153]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffffe9269b0 a2=0 a3=7ffffe92699c items=0 ppid=3000 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.144000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 08:17:55.145000 audit[3154]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_chain pid=3154 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.145000 audit[3154]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdca27aab0 a2=0 a3=7ffdca27aa9c items=0 ppid=3000 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.145000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 08:17:55.146000 audit[3156]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.146000 audit[3156]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd210e7f10 a2=0 a3=7ffd210e7efc items=0 ppid=3000 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.146000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 08:17:55.147000 audit[3157]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.147000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdbba6bd30 a2=0 a3=7ffdbba6bd1c items=0 ppid=3000 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.147000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 08:17:55.148000 audit[3159]: NETFILTER_CFG table=filter:95 family=10 entries=1 op=nft_register_rule pid=3159 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.148000 audit[3159]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd9430f6c0 a2=0 a3=7ffd9430f6ac items=0 ppid=3000 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.148000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 08:17:55.150000 audit[3162]: NETFILTER_CFG table=filter:96 family=10 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.150000 audit[3162]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd77d1c670 a2=0 a3=7ffd77d1c65c 
items=0 ppid=3000 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.150000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 08:17:55.152000 audit[3165]: NETFILTER_CFG table=filter:97 family=10 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.152000 audit[3165]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc70a166e0 a2=0 a3=7ffc70a166cc items=0 ppid=3000 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.152000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Feb 13 08:17:55.153000 audit[3166]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3166 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.153000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff99fd09f0 a2=0 a3=7fff99fd09dc items=0 ppid=3000 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.153000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 08:17:55.154000 audit[3168]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3168 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.154000 audit[3168]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7fff0cdba5a0 a2=0 a3=7fff0cdba58c items=0 ppid=3000 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.154000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:17:55.156000 audit[3171]: NETFILTER_CFG table=nat:100 family=10 entries=2 op=nft_register_chain pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:17:55.156000 audit[3171]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7fff5b376b10 a2=0 a3=7fff5b376afc items=0 ppid=3000 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.156000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:17:55.159000 audit[3175]: NETFILTER_CFG table=filter:101 family=10 entries=3 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 08:17:55.159000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffd810aae60 a2=0 a3=7ffd810aae4c items=0 ppid=3000 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.159000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:55.159000 audit[3175]: NETFILTER_CFG table=nat:102 family=10 entries=10 op=nft_register_chain pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 08:17:55.159000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=1968 a0=3 a1=7ffd810aae60 a2=0 a3=7ffd810aae4c items=0 ppid=3000 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:55.159000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:56.295088 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4241312978.mount: Deactivated successfully. Feb 13 08:17:57.366254 env[1563]: time="2024-02-13T08:17:57.366153074Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:57.370234 env[1563]: time="2024-02-13T08:17:57.370112007Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:57.375677 env[1563]: time="2024-02-13T08:17:57.375575347Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:57.428019 env[1563]: time="2024-02-13T08:17:57.427882957Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:715ac9a30f8a9579e44258af20de354715429e11836b493918e9e1a696e9b028,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:17:57.430547 env[1563]: time="2024-02-13T08:17:57.430417791Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\" returns image reference \"sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827\"" Feb 13 08:17:57.434825 env[1563]: time="2024-02-13T08:17:57.434711580Z" level=info msg="CreateContainer within sandbox \"5faea167b4860c5427ec67f287a3748586a796844f9778eae4b19c1cb9a501cc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 08:17:57.517299 env[1563]: time="2024-02-13T08:17:57.517157058Z" level=info msg="CreateContainer within sandbox \"5faea167b4860c5427ec67f287a3748586a796844f9778eae4b19c1cb9a501cc\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container 
id \"eb78b34a55181534c6ee5261d74c1b185f501753f3756ec6324b9ebee157ae21\"" Feb 13 08:17:57.518194 env[1563]: time="2024-02-13T08:17:57.518149829Z" level=info msg="StartContainer for \"eb78b34a55181534c6ee5261d74c1b185f501753f3756ec6324b9ebee157ae21\"" Feb 13 08:17:57.556291 env[1563]: time="2024-02-13T08:17:57.556260845Z" level=info msg="StartContainer for \"eb78b34a55181534c6ee5261d74c1b185f501753f3756ec6324b9ebee157ae21\" returns successfully" Feb 13 08:17:59.126000 audit[3250]: NETFILTER_CFG table=filter:103 family=2 entries=13 op=nft_register_rule pid=3250 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:17:59.126000 audit[3250]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7fff50af3b50 a2=0 a3=7fff50af3b3c items=0 ppid=3000 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:59.126000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:59.126000 audit[3250]: NETFILTER_CFG table=nat:104 family=2 entries=20 op=nft_register_rule pid=3250 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:17:59.126000 audit[3250]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7fff50af3b50 a2=0 a3=7fff50af3b3c items=0 ppid=3000 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:59.126000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:59.161000 audit[3276]: NETFILTER_CFG table=filter:105 family=2 entries=14 op=nft_register_rule pid=3276 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:17:59.161000 audit[3276]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffc8e616fb0 a2=0 a3=7ffc8e616f9c items=0 ppid=3000 pid=3276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:59.161000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:59.162000 audit[3276]: NETFILTER_CFG table=nat:106 family=2 entries=20 op=nft_register_rule pid=3276 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:17:59.162000 audit[3276]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffc8e616fb0 a2=0 a3=7ffc8e616f9c items=0 ppid=3000 pid=3276 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:17:59.162000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:17:59.254332 kubelet[2738]: I0213 08:17:59.254270 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-cfc98749c-qckdw" podStartSLOduration=-9.223372031600588e+09 pod.CreationTimestamp="2024-02-13 08:17:54 +0000 UTC" firstStartedPulling="2024-02-13 08:17:55.115242572 +0000 UTC 
m=+14.642151638" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:17:57.714119288 +0000 UTC m=+17.241028432" watchObservedRunningTime="2024-02-13 08:17:59.254188156 +0000 UTC m=+18.781097265" Feb 13 08:17:59.255316 kubelet[2738]: I0213 08:17:59.254510 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:59.294906 kubelet[2738]: I0213 08:17:59.294877 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:59.421753 kubelet[2738]: I0213 08:17:59.421595 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:17:59.422305 kubelet[2738]: E0213 08:17:59.422258 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:17:59.449530 kubelet[2738]: I0213 08:17:59.449429 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/437f05ff-1528-4023-8ab8-92d5af256cf5-typha-certs\") pod \"calico-typha-64b6cc8f6f-dzbj8\" (UID: \"437f05ff-1528-4023-8ab8-92d5af256cf5\") " pod="calico-system/calico-typha-64b6cc8f6f-dzbj8" Feb 13 08:17:59.449775 kubelet[2738]: I0213 08:17:59.449704 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-policysync\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.449882 kubelet[2738]: I0213 08:17:59.449808 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-flexvol-driver-host\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.449882 kubelet[2738]: I0213 08:17:59.449873 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-lib-modules\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.450100 kubelet[2738]: I0213 08:17:59.449956 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-cni-net-dir\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.450100 kubelet[2738]: I0213 08:17:59.450050 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-cni-log-dir\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.450315 kubelet[2738]: I0213 08:17:59.450219 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf974\" (UniqueName: 
\"kubernetes.io/projected/437f05ff-1528-4023-8ab8-92d5af256cf5-kube-api-access-bf974\") pod \"calico-typha-64b6cc8f6f-dzbj8\" (UID: \"437f05ff-1528-4023-8ab8-92d5af256cf5\") " pod="calico-system/calico-typha-64b6cc8f6f-dzbj8" Feb 13 08:17:59.450490 kubelet[2738]: I0213 08:17:59.450421 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-var-run-calico\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.450665 kubelet[2738]: I0213 08:17:59.450606 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-xtables-lock\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.450883 kubelet[2738]: I0213 08:17:59.450847 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-var-lib-calico\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.451048 kubelet[2738]: I0213 08:17:59.451015 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-cni-bin-dir\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.451209 kubelet[2738]: I0213 08:17:59.451181 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/437f05ff-1528-4023-8ab8-92d5af256cf5-tigera-ca-bundle\") pod \"calico-typha-64b6cc8f6f-dzbj8\" (UID: \"437f05ff-1528-4023-8ab8-92d5af256cf5\") " pod="calico-system/calico-typha-64b6cc8f6f-dzbj8" Feb 13 08:17:59.451312 kubelet[2738]: I0213 08:17:59.451269 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-tigera-ca-bundle\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.451432 kubelet[2738]: I0213 08:17:59.451353 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-node-certs\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.451552 kubelet[2738]: I0213 08:17:59.451510 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktcjc\" (UniqueName: \"kubernetes.io/projected/63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a-kube-api-access-ktcjc\") pod \"calico-node-cph82\" (UID: \"63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a\") " pod="calico-system/calico-node-cph82" Feb 13 08:17:59.551928 kubelet[2738]: I0213 08:17:59.551875 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: 
\"kubernetes.io/host-path/8100728f-8434-43ba-8770-5d3f00e1f18f-varrun\") pod \"csi-node-driver-mrqvb\" (UID: \"8100728f-8434-43ba-8770-5d3f00e1f18f\") " pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:17:59.552174 kubelet[2738]: I0213 08:17:59.552013 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8100728f-8434-43ba-8770-5d3f00e1f18f-kubelet-dir\") pod \"csi-node-driver-mrqvb\" (UID: \"8100728f-8434-43ba-8770-5d3f00e1f18f\") " pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:17:59.552174 kubelet[2738]: I0213 08:17:59.552078 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8100728f-8434-43ba-8770-5d3f00e1f18f-socket-dir\") pod \"csi-node-driver-mrqvb\" (UID: \"8100728f-8434-43ba-8770-5d3f00e1f18f\") " pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:17:59.552442 kubelet[2738]: I0213 08:17:59.552191 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zhwv\" (UniqueName: \"kubernetes.io/projected/8100728f-8434-43ba-8770-5d3f00e1f18f-kube-api-access-4zhwv\") pod \"csi-node-driver-mrqvb\" (UID: \"8100728f-8434-43ba-8770-5d3f00e1f18f\") " pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:17:59.552620 kubelet[2738]: I0213 08:17:59.552583 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8100728f-8434-43ba-8770-5d3f00e1f18f-registration-dir\") pod \"csi-node-driver-mrqvb\" (UID: \"8100728f-8434-43ba-8770-5d3f00e1f18f\") " pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:17:59.554084 kubelet[2738]: E0213 08:17:59.554049 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.554084 kubelet[2738]: W0213 08:17:59.554074 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.554343 kubelet[2738]: E0213 08:17:59.554111 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.554419 kubelet[2738]: E0213 08:17:59.554380 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.554419 kubelet[2738]: W0213 08:17:59.554396 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.554419 kubelet[2738]: E0213 08:17:59.554418 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.557687 kubelet[2738]: E0213 08:17:59.557652 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.557687 kubelet[2738]: W0213 08:17:59.557676 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.557906 kubelet[2738]: E0213 08:17:59.557709 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.558115 kubelet[2738]: E0213 08:17:59.558066 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.558115 kubelet[2738]: W0213 08:17:59.558083 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.558115 kubelet[2738]: E0213 08:17:59.558104 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.654635 kubelet[2738]: E0213 08:17:59.654579 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.654635 kubelet[2738]: W0213 08:17:59.654621 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.655056 kubelet[2738]: E0213 08:17:59.654681 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.655254 kubelet[2738]: E0213 08:17:59.655222 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.655488 kubelet[2738]: W0213 08:17:59.655252 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.655488 kubelet[2738]: E0213 08:17:59.655308 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.655766 kubelet[2738]: E0213 08:17:59.655736 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.655916 kubelet[2738]: W0213 08:17:59.655766 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.655916 kubelet[2738]: E0213 08:17:59.655817 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.656387 kubelet[2738]: E0213 08:17:59.656354 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.656515 kubelet[2738]: W0213 08:17:59.656390 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.656515 kubelet[2738]: E0213 08:17:59.656450 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.656799 kubelet[2738]: E0213 08:17:59.656778 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.656799 kubelet[2738]: W0213 08:17:59.656798 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.656981 kubelet[2738]: E0213 08:17:59.656832 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.657193 kubelet[2738]: E0213 08:17:59.657172 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.657193 kubelet[2738]: W0213 08:17:59.657192 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.657407 kubelet[2738]: E0213 08:17:59.657274 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.657574 kubelet[2738]: E0213 08:17:59.657551 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.657574 kubelet[2738]: W0213 08:17:59.657574 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.657768 kubelet[2738]: E0213 08:17:59.657637 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.657958 kubelet[2738]: E0213 08:17:59.657938 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.657958 kubelet[2738]: W0213 08:17:59.657958 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.658229 kubelet[2738]: E0213 08:17:59.658054 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.658476 kubelet[2738]: E0213 08:17:59.658442 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.658476 kubelet[2738]: W0213 08:17:59.658472 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.658723 kubelet[2738]: E0213 08:17:59.658516 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.658942 kubelet[2738]: E0213 08:17:59.658913 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.658942 kubelet[2738]: W0213 08:17:59.658936 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.659235 kubelet[2738]: E0213 08:17:59.658973 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.659470 kubelet[2738]: E0213 08:17:59.659438 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.659470 kubelet[2738]: W0213 08:17:59.659467 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.659693 kubelet[2738]: E0213 08:17:59.659508 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.659865 kubelet[2738]: E0213 08:17:59.659829 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.659865 kubelet[2738]: W0213 08:17:59.659850 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.660151 kubelet[2738]: E0213 08:17:59.659929 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.660256 kubelet[2738]: E0213 08:17:59.660204 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.660256 kubelet[2738]: W0213 08:17:59.660234 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.660432 kubelet[2738]: E0213 08:17:59.660361 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.660689 kubelet[2738]: E0213 08:17:59.660665 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.660689 kubelet[2738]: W0213 08:17:59.660687 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.660900 kubelet[2738]: E0213 08:17:59.660794 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.661081 kubelet[2738]: E0213 08:17:59.661057 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.661081 kubelet[2738]: W0213 08:17:59.661079 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.661282 kubelet[2738]: E0213 08:17:59.661183 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.661525 kubelet[2738]: E0213 08:17:59.661499 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.661645 kubelet[2738]: W0213 08:17:59.661527 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.661742 kubelet[2738]: E0213 08:17:59.661639 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.661912 kubelet[2738]: E0213 08:17:59.661886 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.662035 kubelet[2738]: W0213 08:17:59.661915 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.662035 kubelet[2738]: E0213 08:17:59.661973 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.662424 kubelet[2738]: E0213 08:17:59.662395 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.662543 kubelet[2738]: W0213 08:17:59.662432 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.662646 kubelet[2738]: E0213 08:17:59.662557 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.662879 kubelet[2738]: E0213 08:17:59.662852 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.662879 kubelet[2738]: W0213 08:17:59.662874 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.663120 kubelet[2738]: E0213 08:17:59.662958 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.663324 kubelet[2738]: E0213 08:17:59.663262 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.663324 kubelet[2738]: W0213 08:17:59.663292 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.663599 kubelet[2738]: E0213 08:17:59.663383 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.663748 kubelet[2738]: E0213 08:17:59.663700 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.663748 kubelet[2738]: W0213 08:17:59.663721 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.664040 kubelet[2738]: E0213 08:17:59.663824 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.664252 kubelet[2738]: E0213 08:17:59.664217 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.664252 kubelet[2738]: W0213 08:17:59.664250 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.664538 kubelet[2738]: E0213 08:17:59.664359 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.664640 kubelet[2738]: E0213 08:17:59.664621 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.664749 kubelet[2738]: W0213 08:17:59.664642 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.664749 kubelet[2738]: E0213 08:17:59.664696 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.665089 kubelet[2738]: E0213 08:17:59.665019 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.665089 kubelet[2738]: W0213 08:17:59.665041 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.665089 kubelet[2738]: E0213 08:17:59.665075 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.665574 kubelet[2738]: E0213 08:17:59.665550 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.665574 kubelet[2738]: W0213 08:17:59.665573 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.665770 kubelet[2738]: E0213 08:17:59.665606 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.665983 kubelet[2738]: E0213 08:17:59.665957 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.666161 kubelet[2738]: W0213 08:17:59.665987 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.666161 kubelet[2738]: E0213 08:17:59.666049 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.666687 kubelet[2738]: E0213 08:17:59.666649 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.666687 kubelet[2738]: W0213 08:17:59.666673 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.666860 kubelet[2738]: E0213 08:17:59.666704 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.676467 kubelet[2738]: E0213 08:17:59.676319 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.676467 kubelet[2738]: W0213 08:17:59.676351 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.676467 kubelet[2738]: E0213 08:17:59.676385 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.764322 kubelet[2738]: E0213 08:17:59.764243 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.764322 kubelet[2738]: W0213 08:17:59.764276 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.764322 kubelet[2738]: E0213 08:17:59.764313 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.764874 kubelet[2738]: E0213 08:17:59.764810 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.764874 kubelet[2738]: W0213 08:17:59.764840 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.764874 kubelet[2738]: E0213 08:17:59.764874 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.865641 kubelet[2738]: E0213 08:17:59.865590 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.865641 kubelet[2738]: W0213 08:17:59.865630 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.866035 kubelet[2738]: E0213 08:17:59.865683 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.866236 kubelet[2738]: E0213 08:17:59.866203 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.866236 kubelet[2738]: W0213 08:17:59.866231 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.866503 kubelet[2738]: E0213 08:17:59.866277 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.873602 kubelet[2738]: E0213 08:17:59.873560 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.873602 kubelet[2738]: W0213 08:17:59.873592 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.873844 kubelet[2738]: E0213 08:17:59.873630 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:17:59.898595 env[1563]: time="2024-02-13T08:17:59.898467104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cph82,Uid:63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a,Namespace:calico-system,Attempt:0,}" Feb 13 08:17:59.922641 env[1563]: time="2024-02-13T08:17:59.922463664Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:17:59.922641 env[1563]: time="2024-02-13T08:17:59.922566679Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:17:59.922641 env[1563]: time="2024-02-13T08:17:59.922600683Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:17:59.923117 env[1563]: time="2024-02-13T08:17:59.922966336Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5 pid=3325 runtime=io.containerd.runc.v2 Feb 13 08:17:59.967289 kubelet[2738]: E0213 08:17:59.967178 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:17:59.967289 kubelet[2738]: W0213 08:17:59.967204 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:17:59.967289 kubelet[2738]: E0213 08:17:59.967245 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:17:59.978556 env[1563]: time="2024-02-13T08:17:59.978477273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cph82,Uid:63c4effc-2c4f-49d0-9f2b-0e5b5d094a4a,Namespace:calico-system,Attempt:0,} returns sandbox id \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\"" Feb 13 08:17:59.980472 env[1563]: time="2024-02-13T08:17:59.980421738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\"" Feb 13 08:18:00.068578 kubelet[2738]: E0213 08:18:00.068495 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:18:00.068578 kubelet[2738]: W0213 08:18:00.068531 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:18:00.068578 kubelet[2738]: E0213 08:18:00.068580 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:18:00.077319 kubelet[2738]: E0213 08:18:00.077263 2738 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:18:00.077319 kubelet[2738]: W0213 08:18:00.077298 2738 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:18:00.077617 kubelet[2738]: E0213 08:18:00.077336 2738 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:18:00.161719 env[1563]: time="2024-02-13T08:18:00.161661614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64b6cc8f6f-dzbj8,Uid:437f05ff-1528-4023-8ab8-92d5af256cf5,Namespace:calico-system,Attempt:0,}" Feb 13 08:18:00.169006 env[1563]: time="2024-02-13T08:18:00.168955733Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:18:00.169006 env[1563]: time="2024-02-13T08:18:00.168982481Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:18:00.169127 env[1563]: time="2024-02-13T08:18:00.169006532Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:18:00.169127 env[1563]: time="2024-02-13T08:18:00.169101772Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/1e19e923d209b869d594a5a99992b9971e91eb828e2640df7e1e691f080f9cac pid=3369 runtime=io.containerd.runc.v2 Feb 13 08:18:00.203000 audit[3426]: NETFILTER_CFG table=filter:107 family=2 entries=14 op=nft_register_rule pid=3426 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:00.205283 env[1563]: time="2024-02-13T08:18:00.205259918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64b6cc8f6f-dzbj8,Uid:437f05ff-1528-4023-8ab8-92d5af256cf5,Namespace:calico-system,Attempt:0,} returns sandbox id \"1e19e923d209b869d594a5a99992b9971e91eb828e2640df7e1e691f080f9cac\"" Feb 13 08:18:00.231021 kernel: kauditd_printk_skb: 134 callbacks suppressed Feb 13 08:18:00.231100 kernel: audit: type=1325 audit(1707812280.203:277): table=filter:107 family=2 entries=14 op=nft_register_rule pid=3426 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:00.203000 audit[3426]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffd6b06f960 a2=0 a3=7ffd6b06f94c items=0 ppid=3000 pid=3426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:00.387262 kernel: audit: type=1300 audit(1707812280.203:277): arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffd6b06f960 a2=0 a3=7ffd6b06f94c items=0 ppid=3000 pid=3426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:00.387310 kernel: audit: type=1327 audit(1707812280.203:277): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:00.203000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:00.203000 audit[3426]: NETFILTER_CFG table=nat:108 family=2 entries=20 op=nft_register_rule pid=3426 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:00.203000 audit[3426]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffd6b06f960 a2=0 a3=7ffd6b06f94c items=0 ppid=3000 pid=3426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:00.612753 kernel: audit: type=1325 audit(1707812280.203:278): table=nat:108 family=2 entries=20 op=nft_register_rule pid=3426 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:00.612837 kernel: audit: type=1300 audit(1707812280.203:278): arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffd6b06f960 a2=0 a3=7ffd6b06f94c items=0 ppid=3000 pid=3426 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:00.612862 kernel: audit: type=1327 audit(1707812280.203:278): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:00.203000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:01.656393 kubelet[2738]: E0213 08:18:01.656301 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:03.656540 kubelet[2738]: E0213 08:18:03.656432 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:03.685781 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3314127629.mount: Deactivated successfully. 
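The proctitle= fields in the audit records above are hex-encoded command lines in which NUL bytes separate the arguments; the payload accompanying the iptables-restor/ip6tables-resto records throughout this section decodes to "iptables-restore -w 5 -W 100000 --noflush --counters" (and its ip6tables counterpart), consistent with kube-proxy reloading the KUBE-SERVICES rules seen earlier. A minimal standalone Go sketch of the decoding (the helper name decodeProctitle is illustrative, not part of any component logged here):

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns an audit PROCTITLE hex payload back into a readable
// command line: the decoded bytes are argv elements separated by NUL bytes.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	return strings.Join(strings.Split(string(raw), "\x00"), " "), nil
}

func main() {
	// Payload copied from the audit records above.
	const p = "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273"
	cmd, err := decodeProctitle(p)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmd) // iptables-restore -w 5 -W 100000 --noflush --counters
}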
Feb 13 08:18:05.655533 kubelet[2738]: E0213 08:18:05.655480 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:07.655835 kubelet[2738]: E0213 08:18:07.655772 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:09.655737 kubelet[2738]: E0213 08:18:09.655612 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:11.656518 kubelet[2738]: E0213 08:18:11.656414 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:13.655788 kubelet[2738]: E0213 08:18:13.655695 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:15.656703 kubelet[2738]: E0213 08:18:15.656588 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:17.656132 kubelet[2738]: E0213 08:18:17.656027 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:19.655863 kubelet[2738]: E0213 08:18:19.655799 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:21.655943 kubelet[2738]: E0213 08:18:21.655917 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:23.656184 kubelet[2738]: E0213 08:18:23.656120 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:25.656389 kubelet[2738]: E0213 08:18:25.656282 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:27.655948 kubelet[2738]: E0213 08:18:27.655887 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:29.656236 kubelet[2738]: E0213 08:18:29.656141 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:31.146295 env[1563]: time="2024-02-13T08:18:31.146238269Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:31.146887 env[1563]: time="2024-02-13T08:18:31.146848906Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:31.148358 env[1563]: time="2024-02-13T08:18:31.148316778Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:31.149394 env[1563]: time="2024-02-13T08:18:31.149350985Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b05edbd1f80db4ada229e6001a666a7dd36bb6ab617143684fb3d28abfc4b71e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:31.149915 env[1563]: time="2024-02-13T08:18:31.149871310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\" returns image reference \"sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a\"" Feb 13 08:18:31.150386 env[1563]: time="2024-02-13T08:18:31.150356078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\"" Feb 13 08:18:31.151107 env[1563]: time="2024-02-13T08:18:31.151093623Z" level=info msg="CreateContainer within sandbox \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 08:18:31.156427 env[1563]: time="2024-02-13T08:18:31.156382198Z" level=info msg="CreateContainer within sandbox \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e164da54a425029b5dc255122d76223b60a7cf0b00e43f9c1c40d50bdf8ec8f4\"" Feb 13 08:18:31.156752 env[1563]: time="2024-02-13T08:18:31.156718378Z" level=info 
msg="StartContainer for \"e164da54a425029b5dc255122d76223b60a7cf0b00e43f9c1c40d50bdf8ec8f4\"" Feb 13 08:18:31.180558 env[1563]: time="2024-02-13T08:18:31.180507479Z" level=info msg="StartContainer for \"e164da54a425029b5dc255122d76223b60a7cf0b00e43f9c1c40d50bdf8ec8f4\" returns successfully" Feb 13 08:18:31.261177 env[1563]: time="2024-02-13T08:18:31.261067828Z" level=info msg="shim disconnected" id=e164da54a425029b5dc255122d76223b60a7cf0b00e43f9c1c40d50bdf8ec8f4 Feb 13 08:18:31.261177 env[1563]: time="2024-02-13T08:18:31.261168768Z" level=warning msg="cleaning up after shim disconnected" id=e164da54a425029b5dc255122d76223b60a7cf0b00e43f9c1c40d50bdf8ec8f4 namespace=k8s.io Feb 13 08:18:31.261828 env[1563]: time="2024-02-13T08:18:31.261198365Z" level=info msg="cleaning up dead shim" Feb 13 08:18:31.276658 env[1563]: time="2024-02-13T08:18:31.276572058Z" level=warning msg="cleanup warnings time=\"2024-02-13T08:18:31Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3473 runtime=io.containerd.runc.v2\n" Feb 13 08:18:31.657502 kubelet[2738]: E0213 08:18:31.657418 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:32.160663 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e164da54a425029b5dc255122d76223b60a7cf0b00e43f9c1c40d50bdf8ec8f4-rootfs.mount: Deactivated successfully. Feb 13 08:18:33.656651 kubelet[2738]: E0213 08:18:33.656552 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:35.655885 kubelet[2738]: E0213 08:18:35.655851 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:37.272346 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount92169312.mount: Deactivated successfully. 
Feb 13 08:18:37.656103 kubelet[2738]: E0213 08:18:37.655988 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:39.655598 kubelet[2738]: E0213 08:18:39.655542 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:41.656008 kubelet[2738]: E0213 08:18:41.655943 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:43.655514 kubelet[2738]: E0213 08:18:43.655461 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:45.655764 kubelet[2738]: E0213 08:18:45.655729 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:47.656776 kubelet[2738]: E0213 08:18:47.656656 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:49.655645 kubelet[2738]: E0213 08:18:49.655588 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:51.655790 kubelet[2738]: E0213 08:18:51.655669 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:53.655647 kubelet[2738]: E0213 08:18:53.655523 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:55.656157 kubelet[2738]: E0213 08:18:55.656077 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:57.473649 env[1563]: time="2024-02-13T08:18:57.473602722Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:57.474302 env[1563]: time="2024-02-13T08:18:57.474244655Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:57.476127 env[1563]: time="2024-02-13T08:18:57.476087528Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:57.477559 env[1563]: time="2024-02-13T08:18:57.477519009Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:5f2d3b8c354a4eb6de46e786889913916e620c6c256982fb8d0f1a1d36a282bc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:18:57.478059 env[1563]: time="2024-02-13T08:18:57.478001809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\" returns image reference \"sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c\"" Feb 13 08:18:57.478605 env[1563]: time="2024-02-13T08:18:57.478591815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\"" Feb 13 08:18:57.481973 env[1563]: time="2024-02-13T08:18:57.481953866Z" level=info msg="CreateContainer within sandbox \"1e19e923d209b869d594a5a99992b9971e91eb828e2640df7e1e691f080f9cac\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 08:18:57.486216 env[1563]: time="2024-02-13T08:18:57.486196318Z" level=info msg="CreateContainer within sandbox \"1e19e923d209b869d594a5a99992b9971e91eb828e2640df7e1e691f080f9cac\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"067d8f992d64264c0411834e7e68e8ddefdc280a011253f92cc92650ffa10805\"" Feb 13 08:18:57.486590 env[1563]: time="2024-02-13T08:18:57.486552430Z" level=info msg="StartContainer for \"067d8f992d64264c0411834e7e68e8ddefdc280a011253f92cc92650ffa10805\"" Feb 13 08:18:57.519752 env[1563]: time="2024-02-13T08:18:57.519724870Z" level=info msg="StartContainer for \"067d8f992d64264c0411834e7e68e8ddefdc280a011253f92cc92650ffa10805\" returns successfully" Feb 13 08:18:57.656597 kubelet[2738]: E0213 08:18:57.656491 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:18:57.879457 kubelet[2738]: I0213 08:18:57.879398 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-64b6cc8f6f-dzbj8" podStartSLOduration=-9.223371977975464e+09 pod.CreationTimestamp="2024-02-13 08:17:59 +0000 UTC" firstStartedPulling="2024-02-13 08:18:00.205855255 +0000 UTC m=+19.732764329" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:18:57.878437063 +0000 UTC m=+77.405346224" watchObservedRunningTime="2024-02-13 
08:18:57.879312334 +0000 UTC m=+77.406221454" Feb 13 08:18:57.923000 audit[3576]: NETFILTER_CFG table=filter:109 family=2 entries=13 op=nft_register_rule pid=3576 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:57.923000 audit[3576]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffd5c0202a0 a2=0 a3=7ffd5c02028c items=0 ppid=3000 pid=3576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:58.082340 kernel: audit: type=1325 audit(1707812337.923:279): table=filter:109 family=2 entries=13 op=nft_register_rule pid=3576 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:58.082403 kernel: audit: type=1300 audit(1707812337.923:279): arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffd5c0202a0 a2=0 a3=7ffd5c02028c items=0 ppid=3000 pid=3576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:58.082420 kernel: audit: type=1327 audit(1707812337.923:279): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:57.923000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:57.923000 audit[3576]: NETFILTER_CFG table=nat:110 family=2 entries=27 op=nft_register_chain pid=3576 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:58.200322 kernel: audit: type=1325 audit(1707812337.923:280): table=nat:110 family=2 entries=27 op=nft_register_chain pid=3576 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:18:58.200351 kernel: audit: type=1300 audit(1707812337.923:280): arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffd5c0202a0 a2=0 a3=7ffd5c02028c items=0 ppid=3000 pid=3576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:57.923000 audit[3576]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffd5c0202a0 a2=0 a3=7ffd5c02028c items=0 ppid=3000 pid=3576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:18:58.298657 kernel: audit: type=1327 audit(1707812337.923:280): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:57.923000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:18:59.655834 kubelet[2738]: E0213 08:18:59.655804 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:00.833259 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1138309287.mount: Deactivated successfully. 
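The pair of NETFILTER_CFG events at 08:18:57.923 (and their kernel echoes, audit types 1325/1300/1327) record an iptables-restore run via /usr/sbin/xtables-nft-multi registering 13 filter rules and 27 nat chains right after calico-typha starts. The PROCTITLE record carries the command line as NUL-separated hex; decoding it, as in this standalone snippet with the hex string copied from the records above, recovers the exact invocation.

# Decode the audit PROCTITLE field: argv elements are joined by NUL bytes and hex-encoded.
proctitle = (
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
)
argv = bytes.fromhex(proctitle).split(b"\x00")
print(" ".join(arg.decode() for arg in argv))
# prints: iptables-restore -w 5 -W 100000 --noflush --counters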
Feb 13 08:19:01.656384 kubelet[2738]: E0213 08:19:01.656332 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:03.655943 kubelet[2738]: E0213 08:19:03.655913 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:05.656336 kubelet[2738]: E0213 08:19:05.656312 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:07.656583 kubelet[2738]: E0213 08:19:07.656520 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:09.656178 kubelet[2738]: E0213 08:19:09.656123 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:11.656320 kubelet[2738]: E0213 08:19:11.656269 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:13.655649 kubelet[2738]: E0213 08:19:13.655597 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:15.656284 kubelet[2738]: E0213 08:19:15.656230 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:17.656039 kubelet[2738]: E0213 08:19:17.655959 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:19.655683 kubelet[2738]: E0213 08:19:19.655630 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:21.655332 kubelet[2738]: E0213 08:19:21.655280 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:23.655563 kubelet[2738]: E0213 08:19:23.655509 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:25.655897 kubelet[2738]: E0213 08:19:25.655841 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:27.656440 kubelet[2738]: E0213 08:19:27.656369 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:29.656045 kubelet[2738]: E0213 08:19:29.656012 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:31.655972 kubelet[2738]: E0213 08:19:31.655883 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:33.656177 kubelet[2738]: E0213 08:19:33.656071 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:35.655872 kubelet[2738]: E0213 08:19:35.655804 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:37.655700 kubelet[2738]: E0213 08:19:37.655666 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not 
initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:39.655983 kubelet[2738]: E0213 08:19:39.655929 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:40.735536 kubelet[2738]: E0213 08:19:40.735455 2738 kubelet_node_status.go:452] "Node not becoming ready in time after startup" Feb 13 08:19:41.655578 kubelet[2738]: E0213 08:19:41.655523 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:43.656545 kubelet[2738]: E0213 08:19:43.656421 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:45.655913 kubelet[2738]: E0213 08:19:45.655790 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:45.709031 kubelet[2738]: E0213 08:19:45.708935 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:19:47.656497 kubelet[2738]: E0213 08:19:47.656379 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:49.656723 kubelet[2738]: E0213 08:19:49.656602 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:50.711118 kubelet[2738]: E0213 08:19:50.711026 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:19:51.655953 kubelet[2738]: E0213 08:19:51.655900 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:53.655691 kubelet[2738]: E0213 08:19:53.655582 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:55.655604 kubelet[2738]: E0213 08:19:55.655548 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:55.712503 kubelet[2738]: E0213 08:19:55.712403 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:19:57.656063 kubelet[2738]: E0213 08:19:57.655946 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:19:59.656377 kubelet[2738]: E0213 08:19:59.656261 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:00.714321 kubelet[2738]: E0213 08:20:00.714226 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:01.655593 kubelet[2738]: E0213 08:20:01.655488 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:03.655703 kubelet[2738]: E0213 08:20:03.655668 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:05.656570 kubelet[2738]: E0213 08:20:05.656460 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:05.715912 kubelet[2738]: E0213 08:20:05.715811 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:07.655668 kubelet[2738]: E0213 08:20:07.655567 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:09.655935 kubelet[2738]: E0213 08:20:09.655904 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:10.717205 kubelet[2738]: E0213 08:20:10.717106 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:11.655420 kubelet[2738]: E0213 08:20:11.655387 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:13.656028 kubelet[2738]: E0213 08:20:13.655942 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:15.656386 kubelet[2738]: E0213 08:20:15.656278 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:15.718222 kubelet[2738]: E0213 08:20:15.718160 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:17.656226 kubelet[2738]: E0213 08:20:17.656172 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:19.656593 kubelet[2738]: E0213 08:20:19.656490 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:20.719390 kubelet[2738]: E0213 08:20:20.719289 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:21.655325 kubelet[2738]: E0213 08:20:21.655291 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:23.655719 kubelet[2738]: E0213 08:20:23.655616 2738 
pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:25.655447 kubelet[2738]: E0213 08:20:25.655385 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:25.721487 kubelet[2738]: E0213 08:20:25.721393 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:27.655646 kubelet[2738]: E0213 08:20:27.655612 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:29.656074 kubelet[2738]: E0213 08:20:29.656036 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:30.722725 kubelet[2738]: E0213 08:20:30.722675 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:31.655873 kubelet[2738]: E0213 08:20:31.655813 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:33.656648 kubelet[2738]: E0213 08:20:33.656557 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:35.656632 kubelet[2738]: E0213 08:20:35.656534 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:35.723870 kubelet[2738]: E0213 08:20:35.723760 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:37.655913 kubelet[2738]: E0213 08:20:37.655856 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:39.655866 kubelet[2738]: E0213 08:20:39.655811 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:40.725206 kubelet[2738]: E0213 08:20:40.725139 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:41.655945 kubelet[2738]: E0213 08:20:41.655837 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:43.656173 kubelet[2738]: E0213 08:20:43.656054 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:45.655764 kubelet[2738]: E0213 08:20:45.655702 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:45.726845 kubelet[2738]: E0213 08:20:45.726795 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:47.656759 kubelet[2738]: E0213 08:20:47.656634 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:49.655971 kubelet[2738]: E0213 08:20:49.655844 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:50.728460 kubelet[2738]: E0213 08:20:50.728355 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:51.656632 kubelet[2738]: E0213 08:20:51.656513 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" 
podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:53.656140 kubelet[2738]: E0213 08:20:53.656019 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:55.655907 kubelet[2738]: E0213 08:20:55.655833 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:55.729902 kubelet[2738]: E0213 08:20:55.729812 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:20:57.656155 kubelet[2738]: E0213 08:20:57.656103 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:20:59.655750 kubelet[2738]: E0213 08:20:59.655696 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:00.731787 kubelet[2738]: E0213 08:21:00.731713 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:21:01.656564 kubelet[2738]: E0213 08:21:01.656473 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:03.655618 kubelet[2738]: E0213 08:21:03.655572 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:05.656125 kubelet[2738]: E0213 08:21:05.656063 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:05.733264 kubelet[2738]: E0213 08:21:05.733200 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:21:07.656185 kubelet[2738]: E0213 08:21:07.656097 2738 pod_workers.go:965] "Error syncing pod, 
skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:09.656692 kubelet[2738]: E0213 08:21:09.656595 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:10.734742 kubelet[2738]: E0213 08:21:10.734635 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:21:11.656239 kubelet[2738]: E0213 08:21:11.656131 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:13.655741 kubelet[2738]: E0213 08:21:13.655663 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:15.656605 kubelet[2738]: E0213 08:21:15.656480 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:15.736776 kubelet[2738]: E0213 08:21:15.736676 2738 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:21:17.655378 kubelet[2738]: E0213 08:21:17.655328 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:18.619410 env[1563]: time="2024-02-13T08:21:18.619387189Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:21:18.620120 env[1563]: time="2024-02-13T08:21:18.620074279Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:21:18.621006 env[1563]: time="2024-02-13T08:21:18.620963249Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:21:18.622036 env[1563]: time="2024-02-13T08:21:18.621967885Z" level=info msg="ImageCreate event 
&ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:d943b4c23e82a39b0186a1a3b2fe8f728e543d503df72d7be521501a82b7e7b4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:21:18.622461 env[1563]: time="2024-02-13T08:21:18.622419733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\" returns image reference \"sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93\"" Feb 13 08:21:18.634997 env[1563]: time="2024-02-13T08:21:18.634973357Z" level=info msg="CreateContainer within sandbox \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 08:21:18.640152 env[1563]: time="2024-02-13T08:21:18.640084963Z" level=info msg="CreateContainer within sandbox \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1b7d7f4acd6f61cd163ab3a04a24b56e4cf7b8963f5182a843dce855b9aa7edc\"" Feb 13 08:21:18.640475 env[1563]: time="2024-02-13T08:21:18.640432236Z" level=info msg="StartContainer for \"1b7d7f4acd6f61cd163ab3a04a24b56e4cf7b8963f5182a843dce855b9aa7edc\"" Feb 13 08:21:18.665051 env[1563]: time="2024-02-13T08:21:18.664996242Z" level=info msg="StartContainer for \"1b7d7f4acd6f61cd163ab3a04a24b56e4cf7b8963f5182a843dce855b9aa7edc\" returns successfully" Feb 13 08:21:19.407358 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1b7d7f4acd6f61cd163ab3a04a24b56e4cf7b8963f5182a843dce855b9aa7edc-rootfs.mount: Deactivated successfully. Feb 13 08:21:19.655868 kubelet[2738]: E0213 08:21:19.655793 2738 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:19.900415 env[1563]: time="2024-02-13T08:21:19.900311609Z" level=info msg="shim disconnected" id=1b7d7f4acd6f61cd163ab3a04a24b56e4cf7b8963f5182a843dce855b9aa7edc Feb 13 08:21:19.900415 env[1563]: time="2024-02-13T08:21:19.900407577Z" level=warning msg="cleaning up after shim disconnected" id=1b7d7f4acd6f61cd163ab3a04a24b56e4cf7b8963f5182a843dce855b9aa7edc namespace=k8s.io Feb 13 08:21:19.901508 env[1563]: time="2024-02-13T08:21:19.900434540Z" level=info msg="cleaning up dead shim" Feb 13 08:21:19.908258 env[1563]: time="2024-02-13T08:21:19.908216375Z" level=warning msg="cleanup warnings time=\"2024-02-13T08:21:19Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3658 runtime=io.containerd.runc.v2\n" Feb 13 08:21:20.228205 env[1563]: time="2024-02-13T08:21:20.227977111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.27.0\"" Feb 13 08:21:21.661450 env[1563]: time="2024-02-13T08:21:21.661427059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mrqvb,Uid:8100728f-8434-43ba-8770-5d3f00e1f18f,Namespace:calico-system,Attempt:0,}" Feb 13 08:21:21.689461 env[1563]: time="2024-02-13T08:21:21.689407662Z" level=error msg="Failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:21.689676 env[1563]: time="2024-02-13T08:21:21.689629882Z" level=error 
msg="encountered an error cleaning up failed sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:21.689676 env[1563]: time="2024-02-13T08:21:21.689659658Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mrqvb,Uid:8100728f-8434-43ba-8770-5d3f00e1f18f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:21.689845 kubelet[2738]: E0213 08:21:21.689807 2738 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:21.690042 kubelet[2738]: E0213 08:21:21.689846 2738 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:21:21.690042 kubelet[2738]: E0213 08:21:21.689862 2738 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mrqvb" Feb 13 08:21:21.690042 kubelet[2738]: E0213 08:21:21.689895 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mrqvb_calico-system(8100728f-8434-43ba-8770-5d3f00e1f18f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mrqvb_calico-system(8100728f-8434-43ba-8770-5d3f00e1f18f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:21.690973 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f-shm.mount: Deactivated successfully. 
Feb 13 08:21:22.233608 kubelet[2738]: I0213 08:21:22.233513 2738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:21:22.234922 env[1563]: time="2024-02-13T08:21:22.234809301Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:21:22.262332 env[1563]: time="2024-02-13T08:21:22.262269427Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:22.262612 kubelet[2738]: E0213 08:21:22.262494 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:21:22.262669 kubelet[2738]: E0213 08:21:22.262619 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:21:22.262669 kubelet[2738]: E0213 08:21:22.262668 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:22.262742 kubelet[2738]: E0213 08:21:22.262694 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:24.143001 kubelet[2738]: I0213 08:21:24.142977 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:21:24.143955 kubelet[2738]: I0213 08:21:24.143942 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:21:24.144140 kubelet[2738]: I0213 08:21:24.144127 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:21:24.330298 kubelet[2738]: I0213 08:21:24.330233 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cg9r\" (UniqueName: \"kubernetes.io/projected/7d5a107e-32fc-46ef-9ba6-381664363494-kube-api-access-4cg9r\") pod \"coredns-787d4945fb-vhpxp\" (UID: \"7d5a107e-32fc-46ef-9ba6-381664363494\") " 
pod="kube-system/coredns-787d4945fb-vhpxp" Feb 13 08:21:24.330655 kubelet[2738]: I0213 08:21:24.330340 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fbp\" (UniqueName: \"kubernetes.io/projected/26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0-kube-api-access-p2fbp\") pod \"coredns-787d4945fb-tr9dh\" (UID: \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\") " pod="kube-system/coredns-787d4945fb-tr9dh" Feb 13 08:21:24.330655 kubelet[2738]: I0213 08:21:24.330499 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d-tigera-ca-bundle\") pod \"calico-kube-controllers-646c8f86fc-xkf58\" (UID: \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\") " pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" Feb 13 08:21:24.330944 kubelet[2738]: I0213 08:21:24.330677 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d5a107e-32fc-46ef-9ba6-381664363494-config-volume\") pod \"coredns-787d4945fb-vhpxp\" (UID: \"7d5a107e-32fc-46ef-9ba6-381664363494\") " pod="kube-system/coredns-787d4945fb-vhpxp" Feb 13 08:21:24.330944 kubelet[2738]: I0213 08:21:24.330852 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52xr\" (UniqueName: \"kubernetes.io/projected/d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d-kube-api-access-m52xr\") pod \"calico-kube-controllers-646c8f86fc-xkf58\" (UID: \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\") " pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" Feb 13 08:21:24.330944 kubelet[2738]: I0213 08:21:24.330927 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0-config-volume\") pod \"coredns-787d4945fb-tr9dh\" (UID: \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\") " pod="kube-system/coredns-787d4945fb-tr9dh" Feb 13 08:21:24.746889 env[1563]: time="2024-02-13T08:21:24.746742935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-vhpxp,Uid:7d5a107e-32fc-46ef-9ba6-381664363494,Namespace:kube-system,Attempt:0,}" Feb 13 08:21:24.746889 env[1563]: time="2024-02-13T08:21:24.746791862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-646c8f86fc-xkf58,Uid:d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d,Namespace:calico-system,Attempt:0,}" Feb 13 08:21:24.747879 env[1563]: time="2024-02-13T08:21:24.747070306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-tr9dh,Uid:26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0,Namespace:kube-system,Attempt:0,}" Feb 13 08:21:24.787860 env[1563]: time="2024-02-13T08:21:24.787819538Z" level=error msg="Failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.787964 env[1563]: time="2024-02-13T08:21:24.787938742Z" level=error msg="Failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788112 env[1563]: time="2024-02-13T08:21:24.788093387Z" level=error msg="encountered an error cleaning up failed sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788184 env[1563]: time="2024-02-13T08:21:24.788131615Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-646c8f86fc-xkf58,Uid:d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788184 env[1563]: time="2024-02-13T08:21:24.788158359Z" level=error msg="encountered an error cleaning up failed sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788255 env[1563]: time="2024-02-13T08:21:24.788187339Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-vhpxp,Uid:7d5a107e-32fc-46ef-9ba6-381664363494,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788322 kubelet[2738]: E0213 08:21:24.788311 2738 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788356 kubelet[2738]: E0213 08:21:24.788347 2738 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-vhpxp" Feb 13 08:21:24.788379 kubelet[2738]: E0213 08:21:24.788361 2738 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-vhpxp" Feb 13 08:21:24.788401 kubelet[2738]: E0213 08:21:24.788314 2738 remote_runtime.go:176] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788401 kubelet[2738]: E0213 08:21:24.788393 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-vhpxp_kube-system(7d5a107e-32fc-46ef-9ba6-381664363494)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-vhpxp_kube-system(7d5a107e-32fc-46ef-9ba6-381664363494)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:21:24.788401 kubelet[2738]: E0213 08:21:24.788400 2738 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" Feb 13 08:21:24.788483 kubelet[2738]: E0213 08:21:24.788413 2738 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" Feb 13 08:21:24.788483 kubelet[2738]: E0213 08:21:24.788435 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-646c8f86fc-xkf58_calico-system(d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-646c8f86fc-xkf58_calico-system(d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:21:24.788543 env[1563]: time="2024-02-13T08:21:24.788394008Z" level=error msg="Failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788567 env[1563]: time="2024-02-13T08:21:24.788545480Z" level=error msg="encountered an error cleaning up failed sandbox 
\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788607 env[1563]: time="2024-02-13T08:21:24.788566301Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-tr9dh,Uid:26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788666 kubelet[2738]: E0213 08:21:24.788660 2738 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:24.788689 kubelet[2738]: E0213 08:21:24.788676 2738 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-tr9dh" Feb 13 08:21:24.788689 kubelet[2738]: E0213 08:21:24.788687 2738 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-tr9dh" Feb 13 08:21:24.788737 kubelet[2738]: E0213 08:21:24.788706 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-tr9dh_kube-system(26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-tr9dh_kube-system(26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:21:24.789260 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb-shm.mount: Deactivated successfully. Feb 13 08:21:24.789346 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9-shm.mount: Deactivated successfully. 
Feb 13 08:21:25.245430 kubelet[2738]: I0213 08:21:25.245373 2738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:21:25.246859 env[1563]: time="2024-02-13T08:21:25.246785096Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:21:25.247555 kubelet[2738]: I0213 08:21:25.247508 2738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:21:25.248727 env[1563]: time="2024-02-13T08:21:25.248629109Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:21:25.249716 kubelet[2738]: I0213 08:21:25.249657 2738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:21:25.250732 env[1563]: time="2024-02-13T08:21:25.250656475Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:21:25.294294 env[1563]: time="2024-02-13T08:21:25.294215888Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:25.294427 env[1563]: time="2024-02-13T08:21:25.294332688Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:25.294472 kubelet[2738]: E0213 08:21:25.294442 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:21:25.294522 kubelet[2738]: E0213 08:21:25.294484 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:21:25.294522 kubelet[2738]: E0213 08:21:25.294486 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:21:25.294522 kubelet[2738]: E0213 08:21:25.294516 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:21:25.294630 kubelet[2738]: E0213 08:21:25.294525 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:25.294630 kubelet[2738]: E0213 08:21:25.294554 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:25.294630 kubelet[2738]: E0213 08:21:25.294559 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:21:25.294820 kubelet[2738]: E0213 08:21:25.294583 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:21:25.294820 kubelet[2738]: E0213 08:21:25.294763 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:21:25.294820 kubelet[2738]: E0213 08:21:25.294778 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:21:25.294820 kubelet[2738]: E0213 08:21:25.294807 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for 
sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:25.295021 env[1563]: time="2024-02-13T08:21:25.294620722Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:25.295070 kubelet[2738]: E0213 08:21:25.294832 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:21:25.451418 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41-shm.mount: Deactivated successfully. Feb 13 08:21:36.656081 env[1563]: time="2024-02-13T08:21:36.655966832Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:21:36.656081 env[1563]: time="2024-02-13T08:21:36.655971218Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:21:36.671677 env[1563]: time="2024-02-13T08:21:36.671640125Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:36.671869 env[1563]: time="2024-02-13T08:21:36.671670977Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:36.671898 kubelet[2738]: E0213 08:21:36.671883 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:21:36.671898 kubelet[2738]: E0213 08:21:36.671888 2738 remote_runtime.go:205] "StopPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:21:36.672117 kubelet[2738]: E0213 08:21:36.671909 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:21:36.672117 kubelet[2738]: E0213 08:21:36.671909 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:21:36.672117 kubelet[2738]: E0213 08:21:36.671930 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:36.672117 kubelet[2738]: E0213 08:21:36.671930 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:36.672117 kubelet[2738]: E0213 08:21:36.671947 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:21:36.672246 kubelet[2738]: E0213 08:21:36.671947 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:39.656685 env[1563]: time="2024-02-13T08:21:39.656626260Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:21:39.656685 env[1563]: time="2024-02-13T08:21:39.656655828Z" level=info msg="StopPodSandbox 
for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:21:39.673098 env[1563]: time="2024-02-13T08:21:39.673035625Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:39.673201 env[1563]: time="2024-02-13T08:21:39.673174065Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:39.673231 kubelet[2738]: E0213 08:21:39.673204 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:21:39.673398 kubelet[2738]: E0213 08:21:39.673234 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:21:39.673398 kubelet[2738]: E0213 08:21:39.673256 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:39.673398 kubelet[2738]: E0213 08:21:39.673275 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:21:39.673398 kubelet[2738]: E0213 08:21:39.673310 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 
13 08:21:39.673398 kubelet[2738]: E0213 08:21:39.673325 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:21:39.673542 kubelet[2738]: E0213 08:21:39.673345 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:39.673542 kubelet[2738]: E0213 08:21:39.673360 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:21:49.657162 env[1563]: time="2024-02-13T08:21:49.657032176Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:21:49.657162 env[1563]: time="2024-02-13T08:21:49.657063205Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:21:49.684126 env[1563]: time="2024-02-13T08:21:49.684087902Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:49.684207 env[1563]: time="2024-02-13T08:21:49.684123456Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:49.684295 kubelet[2738]: E0213 08:21:49.684256 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:21:49.684295 kubelet[2738]: E0213 08:21:49.684281 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:21:49.684492 kubelet[2738]: E0213 08:21:49.684302 2738 
kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:49.684492 kubelet[2738]: E0213 08:21:49.684319 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:21:49.684492 kubelet[2738]: E0213 08:21:49.684256 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:21:49.684492 kubelet[2738]: E0213 08:21:49.684338 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:21:49.684609 kubelet[2738]: E0213 08:21:49.684357 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:49.684609 kubelet[2738]: E0213 08:21:49.684371 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:21:52.656086 env[1563]: time="2024-02-13T08:21:52.656042198Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:21:52.672834 env[1563]: time="2024-02-13T08:21:52.672755117Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox 
\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:52.672927 kubelet[2738]: E0213 08:21:52.672910 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:21:52.673126 kubelet[2738]: E0213 08:21:52.672934 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:21:52.673126 kubelet[2738]: E0213 08:21:52.672956 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:52.673126 kubelet[2738]: E0213 08:21:52.672975 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:21:54.656196 env[1563]: time="2024-02-13T08:21:54.656139947Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:21:54.680197 env[1563]: time="2024-02-13T08:21:54.680163870Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:21:54.680367 kubelet[2738]: E0213 08:21:54.680327 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:21:54.680367 kubelet[2738]: E0213 08:21:54.680351 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:21:54.680553 kubelet[2738]: E0213 08:21:54.680372 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:21:54.680553 kubelet[2738]: E0213 08:21:54.680390 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:22:01.656848 env[1563]: time="2024-02-13T08:22:01.656763519Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:22:01.677287 env[1563]: time="2024-02-13T08:22:01.677224480Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:01.677433 kubelet[2738]: E0213 08:22:01.677383 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:22:01.677433 kubelet[2738]: E0213 08:22:01.677408 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:22:01.677433 kubelet[2738]: E0213 08:22:01.677429 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:01.677678 kubelet[2738]: E0213 08:22:01.677446 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy 
network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:22:03.657488 env[1563]: time="2024-02-13T08:22:03.657356276Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:22:03.708793 env[1563]: time="2024-02-13T08:22:03.708695286Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:03.709024 kubelet[2738]: E0213 08:22:03.708981 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:22:03.709418 kubelet[2738]: E0213 08:22:03.709047 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:22:03.709418 kubelet[2738]: E0213 08:22:03.709099 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:03.709418 kubelet[2738]: E0213 08:22:03.709150 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:22:06.657345 env[1563]: time="2024-02-13T08:22:06.657235609Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:22:06.683369 env[1563]: time="2024-02-13T08:22:06.683305361Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:06.683512 kubelet[2738]: E0213 08:22:06.683471 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:22:06.683512 kubelet[2738]: E0213 08:22:06.683496 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:22:06.683711 kubelet[2738]: E0213 08:22:06.683517 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:06.683711 kubelet[2738]: E0213 08:22:06.683534 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:22:08.655808 env[1563]: time="2024-02-13T08:22:08.655749010Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:22:08.672763 env[1563]: time="2024-02-13T08:22:08.672730915Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:08.672925 kubelet[2738]: E0213 08:22:08.672915 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:22:08.673126 kubelet[2738]: E0213 08:22:08.672939 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:22:08.673126 kubelet[2738]: E0213 08:22:08.672961 2738 
kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:08.673126 kubelet[2738]: E0213 08:22:08.672978 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:22:14.656741 env[1563]: time="2024-02-13T08:22:14.656696695Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:22:14.673166 env[1563]: time="2024-02-13T08:22:14.673102164Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:14.673302 kubelet[2738]: E0213 08:22:14.673252 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:22:14.673302 kubelet[2738]: E0213 08:22:14.673276 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:22:14.673302 kubelet[2738]: E0213 08:22:14.673300 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:14.673530 kubelet[2738]: E0213 08:22:14.673317 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:22:16.656905 env[1563]: time="2024-02-13T08:22:16.656752241Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:22:16.712528 env[1563]: time="2024-02-13T08:22:16.712448961Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:16.712792 kubelet[2738]: E0213 08:22:16.712768 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:22:16.713185 kubelet[2738]: E0213 08:22:16.712834 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:22:16.713185 kubelet[2738]: E0213 08:22:16.712878 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:16.713185 kubelet[2738]: E0213 08:22:16.712912 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:22:18.656413 env[1563]: time="2024-02-13T08:22:18.656376626Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:22:18.673241 env[1563]: time="2024-02-13T08:22:18.673180500Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:18.673398 kubelet[2738]: E0213 08:22:18.673344 2738 
remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:22:18.673398 kubelet[2738]: E0213 08:22:18.673368 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:22:18.673398 kubelet[2738]: E0213 08:22:18.673390 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:18.673630 kubelet[2738]: E0213 08:22:18.673407 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:22:21.656370 env[1563]: time="2024-02-13T08:22:21.656308372Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:22:21.672240 env[1563]: time="2024-02-13T08:22:21.672200617Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:21.672419 kubelet[2738]: E0213 08:22:21.672376 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:22:21.672419 kubelet[2738]: E0213 08:22:21.672403 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:22:21.672605 kubelet[2738]: E0213 08:22:21.672424 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: 
\"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:21.672605 kubelet[2738]: E0213 08:22:21.672443 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:22:25.655906 env[1563]: time="2024-02-13T08:22:25.655872011Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:22:25.671335 env[1563]: time="2024-02-13T08:22:25.671297988Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:25.671514 kubelet[2738]: E0213 08:22:25.671476 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:22:25.671514 kubelet[2738]: E0213 08:22:25.671502 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:22:25.671723 kubelet[2738]: E0213 08:22:25.671526 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:25.671723 kubelet[2738]: E0213 08:22:25.671545 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" 
podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:22:27.656860 env[1563]: time="2024-02-13T08:22:27.656791497Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:22:27.674694 env[1563]: time="2024-02-13T08:22:27.674661233Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:27.674831 kubelet[2738]: E0213 08:22:27.674820 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:22:27.674999 kubelet[2738]: E0213 08:22:27.674845 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:22:27.674999 kubelet[2738]: E0213 08:22:27.674867 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:27.674999 kubelet[2738]: E0213 08:22:27.674884 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:22:30.657670 env[1563]: time="2024-02-13T08:22:30.657562328Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:22:30.707182 env[1563]: time="2024-02-13T08:22:30.707091591Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:30.707385 kubelet[2738]: E0213 08:22:30.707335 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:22:30.707385 kubelet[2738]: E0213 08:22:30.707377 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:22:30.707762 kubelet[2738]: E0213 08:22:30.707420 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:30.707762 kubelet[2738]: E0213 08:22:30.707454 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:22:32.657068 env[1563]: time="2024-02-13T08:22:32.656906227Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:22:32.674807 env[1563]: time="2024-02-13T08:22:32.674751091Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:32.674934 kubelet[2738]: E0213 08:22:32.674921 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:22:32.675126 kubelet[2738]: E0213 08:22:32.674953 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:22:32.675126 kubelet[2738]: E0213 08:22:32.674974 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:32.675126 kubelet[2738]: E0213 08:22:32.675000 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:22:39.656366 env[1563]: time="2024-02-13T08:22:39.656331280Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:22:39.672880 env[1563]: time="2024-02-13T08:22:39.672820118Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:39.672975 kubelet[2738]: E0213 08:22:39.672963 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:22:39.673149 kubelet[2738]: E0213 08:22:39.672995 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:22:39.673149 kubelet[2738]: E0213 08:22:39.673018 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:39.673149 kubelet[2738]: E0213 08:22:39.673035 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:22:41.656965 env[1563]: 
time="2024-02-13T08:22:41.656872600Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:22:41.684077 env[1563]: time="2024-02-13T08:22:41.683998918Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:41.684306 kubelet[2738]: E0213 08:22:41.684266 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:22:41.684306 kubelet[2738]: E0213 08:22:41.684291 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:22:41.684486 kubelet[2738]: E0213 08:22:41.684311 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:41.684486 kubelet[2738]: E0213 08:22:41.684330 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:22:45.656083 env[1563]: time="2024-02-13T08:22:45.656011225Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:22:45.672819 env[1563]: time="2024-02-13T08:22:45.672763798Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:45.672940 kubelet[2738]: E0213 08:22:45.672928 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:22:45.673118 kubelet[2738]: E0213 08:22:45.672955 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:22:45.673118 kubelet[2738]: E0213 08:22:45.672976 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:45.673118 kubelet[2738]: E0213 08:22:45.672998 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:22:47.656070 env[1563]: time="2024-02-13T08:22:47.656009881Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:22:47.672375 env[1563]: time="2024-02-13T08:22:47.672315915Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:47.672530 kubelet[2738]: E0213 08:22:47.672486 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:22:47.672530 kubelet[2738]: E0213 08:22:47.672511 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:22:47.672530 kubelet[2738]: E0213 08:22:47.672533 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:47.672773 kubelet[2738]: E0213 08:22:47.672551 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:22:51.656340 env[1563]: time="2024-02-13T08:22:51.656278799Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:22:51.673170 env[1563]: time="2024-02-13T08:22:51.673108108Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:51.673317 kubelet[2738]: E0213 08:22:51.673271 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:22:51.673317 kubelet[2738]: E0213 08:22:51.673296 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:22:51.673317 kubelet[2738]: E0213 08:22:51.673317 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:51.673546 kubelet[2738]: E0213 08:22:51.673337 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:22:52.656408 env[1563]: time="2024-02-13T08:22:52.656374093Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:22:52.672978 
env[1563]: time="2024-02-13T08:22:52.672922576Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:52.673180 kubelet[2738]: E0213 08:22:52.673134 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:22:52.673180 kubelet[2738]: E0213 08:22:52.673158 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:22:52.673180 kubelet[2738]: E0213 08:22:52.673178 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:52.673283 kubelet[2738]: E0213 08:22:52.673195 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:22:58.656814 env[1563]: time="2024-02-13T08:22:58.656777152Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:22:58.679376 env[1563]: time="2024-02-13T08:22:58.679287710Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:58.679534 kubelet[2738]: E0213 08:22:58.679513 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:22:58.679874 kubelet[2738]: E0213 08:22:58.679556 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:22:58.679874 kubelet[2738]: E0213 08:22:58.679605 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:58.679874 kubelet[2738]: E0213 08:22:58.679642 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:22:59.656629 env[1563]: time="2024-02-13T08:22:59.656502660Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:22:59.682627 env[1563]: time="2024-02-13T08:22:59.682594013Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:22:59.682842 kubelet[2738]: E0213 08:22:59.682751 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:22:59.682842 kubelet[2738]: E0213 08:22:59.682775 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:22:59.682842 kubelet[2738]: E0213 08:22:59.682796 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:22:59.682842 kubelet[2738]: E0213 08:22:59.682814 
2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:23:05.657219 env[1563]: time="2024-02-13T08:23:05.656967601Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:23:05.711026 env[1563]: time="2024-02-13T08:23:05.710881359Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:05.711285 kubelet[2738]: E0213 08:23:05.711241 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:23:05.711819 kubelet[2738]: E0213 08:23:05.711314 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:23:05.711819 kubelet[2738]: E0213 08:23:05.711396 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:05.711819 kubelet[2738]: E0213 08:23:05.711461 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:23:06.657496 env[1563]: time="2024-02-13T08:23:06.657390395Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:23:06.708950 env[1563]: time="2024-02-13T08:23:06.708889283Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" 
failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:06.709183 kubelet[2738]: E0213 08:23:06.709162 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:23:06.709270 kubelet[2738]: E0213 08:23:06.709205 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:23:06.709270 kubelet[2738]: E0213 08:23:06.709250 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:06.709418 kubelet[2738]: E0213 08:23:06.709284 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:23:11.657216 env[1563]: time="2024-02-13T08:23:11.657086063Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:23:11.708312 env[1563]: time="2024-02-13T08:23:11.708226660Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:11.708452 kubelet[2738]: E0213 08:23:11.708433 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:23:11.708762 kubelet[2738]: E0213 08:23:11.708476 2738 kuberuntime_manager.go:965] 
"Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:23:11.708762 kubelet[2738]: E0213 08:23:11.708520 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:11.708762 kubelet[2738]: E0213 08:23:11.708554 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:23:14.656222 env[1563]: time="2024-02-13T08:23:14.656184894Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:23:14.672460 env[1563]: time="2024-02-13T08:23:14.672399144Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:14.672552 kubelet[2738]: E0213 08:23:14.672534 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:23:14.672720 kubelet[2738]: E0213 08:23:14.672559 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:23:14.672720 kubelet[2738]: E0213 08:23:14.672584 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:14.672720 kubelet[2738]: E0213 08:23:14.672603 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: 
code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:23:18.656712 env[1563]: time="2024-02-13T08:23:18.656666643Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:23:18.656712 env[1563]: time="2024-02-13T08:23:18.656661763Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:23:18.673390 env[1563]: time="2024-02-13T08:23:18.673322920Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:18.673524 env[1563]: time="2024-02-13T08:23:18.673324337Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:18.673548 kubelet[2738]: E0213 08:23:18.673531 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:23:18.673732 kubelet[2738]: E0213 08:23:18.673558 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:23:18.673732 kubelet[2738]: E0213 08:23:18.673581 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:18.673732 kubelet[2738]: E0213 08:23:18.673599 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:23:18.673732 kubelet[2738]: E0213 08:23:18.673531 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:23:18.673732 kubelet[2738]: E0213 08:23:18.673621 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:23:18.673854 kubelet[2738]: E0213 08:23:18.673639 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:18.673854 kubelet[2738]: E0213 08:23:18.673654 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:23:22.621811 systemd[1]: Started sshd@7-145.40.67.79:22-139.178.68.195:41810.service. Feb 13 08:23:22.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.79:22-139.178.68.195:41810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:23:22.656623 env[1563]: time="2024-02-13T08:23:22.656578609Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:23:22.669320 env[1563]: time="2024-02-13T08:23:22.669285401Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:22.669583 kubelet[2738]: E0213 08:23:22.669542 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:23:22.669583 kubelet[2738]: E0213 08:23:22.669568 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:23:22.669776 kubelet[2738]: E0213 08:23:22.669589 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:22.669776 kubelet[2738]: E0213 08:23:22.669606 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:23:22.714001 kernel: audit: type=1130 audit(1707812602.621:281): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.79:22-139.178.68.195:41810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:23:22.735022 sshd[4985]: Accepted publickey for core from 139.178.68.195 port 41810 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:22.733000 audit[4985]: USER_ACCT pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:22.736283 sshd[4985]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:22.738706 systemd-logind[1548]: New session 10 of user core. Feb 13 08:23:22.739150 systemd[1]: Started session-10.scope. Feb 13 08:23:22.735000 audit[4985]: CRED_ACQ pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:22.920181 kernel: audit: type=1101 audit(1707812602.733:282): pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:22.920226 kernel: audit: type=1103 audit(1707812602.735:283): pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:22.920271 kernel: audit: type=1006 audit(1707812602.735:284): pid=4985 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Feb 13 08:23:22.735000 audit[4985]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4b6c15f0 a2=3 a3=0 items=0 ppid=1 pid=4985 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:23.073126 kernel: audit: type=1300 audit(1707812602.735:284): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4b6c15f0 a2=3 a3=0 items=0 ppid=1 pid=4985 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:23.073208 kernel: audit: type=1327 audit(1707812602.735:284): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:22.735000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:23.081832 sshd[4985]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:23.083576 systemd[1]: sshd@7-145.40.67.79:22-139.178.68.195:41810.service: Deactivated successfully. Feb 13 08:23:23.084217 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 08:23:23.084244 systemd-logind[1548]: Session 10 logged out. Waiting for processes to exit. Feb 13 08:23:23.084961 systemd-logind[1548]: Removed session 10. 
Feb 13 08:23:22.740000 audit[4985]: USER_START pid=4985 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.199986 kernel: audit: type=1105 audit(1707812602.740:285): pid=4985 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.200024 kernel: audit: type=1103 audit(1707812602.740:286): pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:22.740000 audit[5018]: CRED_ACQ pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.082000 audit[4985]: USER_END pid=4985 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.387705 kernel: audit: type=1106 audit(1707812603.082:287): pid=4985 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.387739 kernel: audit: type=1104 audit(1707812603.082:288): pid=4985 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.082000 audit[4985]: CRED_DISP pid=4985 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:23.083000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.79:22-139.178.68.195:41810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:28.089345 systemd[1]: Started sshd@8-145.40.67.79:22-139.178.68.195:36796.service. Feb 13 08:23:28.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.67.79:22-139.178.68.195:36796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:28.117066 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:28.117134 kernel: audit: type=1130 audit(1707812608.087:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.67.79:22-139.178.68.195:36796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 13 08:23:28.228000 audit[5047]: USER_ACCT pid=5047 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.229362 sshd[5047]: Accepted publickey for core from 139.178.68.195 port 36796 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:28.232284 sshd[5047]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:28.234699 systemd-logind[1548]: New session 11 of user core. Feb 13 08:23:28.235126 systemd[1]: Started session-11.scope. Feb 13 08:23:28.315022 sshd[5047]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:28.316472 systemd[1]: sshd@8-145.40.67.79:22-139.178.68.195:36796.service: Deactivated successfully. Feb 13 08:23:28.317002 systemd-logind[1548]: Session 11 logged out. Waiting for processes to exit. Feb 13 08:23:28.317041 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 08:23:28.317586 systemd-logind[1548]: Removed session 11. Feb 13 08:23:28.230000 audit[5047]: CRED_ACQ pid=5047 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.413648 kernel: audit: type=1101 audit(1707812608.228:291): pid=5047 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.413691 kernel: audit: type=1103 audit(1707812608.230:292): pid=5047 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.413708 kernel: audit: type=1006 audit(1707812608.230:293): pid=5047 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Feb 13 08:23:28.472797 kernel: audit: type=1300 audit(1707812608.230:293): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd8e88af20 a2=3 a3=0 items=0 ppid=1 pid=5047 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:28.230000 audit[5047]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd8e88af20 a2=3 a3=0 items=0 ppid=1 pid=5047 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:28.565676 kernel: audit: type=1327 audit(1707812608.230:293): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:28.230000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:28.596459 kernel: audit: type=1105 audit(1707812608.235:294): pid=5047 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.235000 
audit[5047]: USER_START pid=5047 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.656530 env[1563]: time="2024-02-13T08:23:28.656507595Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:23:28.669195 env[1563]: time="2024-02-13T08:23:28.669157879Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:28.669368 kubelet[2738]: E0213 08:23:28.669326 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:23:28.669368 kubelet[2738]: E0213 08:23:28.669352 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:23:28.669583 kubelet[2738]: E0213 08:23:28.669376 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:28.669583 kubelet[2738]: E0213 08:23:28.669394 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:23:28.235000 audit[5050]: CRED_ACQ pid=5050 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.781081 kernel: audit: type=1103 audit(1707812608.235:295): pid=5050 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.781110 kernel: audit: 
type=1106 audit(1707812608.313:296): pid=5047 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.313000 audit[5047]: USER_END pid=5047 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.876554 kernel: audit: type=1104 audit(1707812608.313:297): pid=5047 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.313000 audit[5047]: CRED_DISP pid=5047 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:28.314000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.67.79:22-139.178.68.195:36796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:29.655986 env[1563]: time="2024-02-13T08:23:29.655950443Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:23:29.676418 env[1563]: time="2024-02-13T08:23:29.676330684Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:29.676789 kubelet[2738]: E0213 08:23:29.676561 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:23:29.676789 kubelet[2738]: E0213 08:23:29.676600 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:23:29.676789 kubelet[2738]: E0213 08:23:29.676642 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:29.676789 kubelet[2738]: E0213 08:23:29.676676 
2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:23:33.322447 systemd[1]: Started sshd@9-145.40.67.79:22-139.178.68.195:36800.service. Feb 13 08:23:33.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.79:22-139.178.68.195:36800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:33.350285 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:33.350379 kernel: audit: type=1130 audit(1707812613.321:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.79:22-139.178.68.195:36800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:33.458000 audit[5134]: USER_ACCT pid=5134 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.459659 sshd[5134]: Accepted publickey for core from 139.178.68.195 port 36800 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:33.462305 sshd[5134]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:33.464640 systemd-logind[1548]: New session 12 of user core. Feb 13 08:23:33.465116 systemd[1]: Started session-12.scope. Feb 13 08:23:33.544264 sshd[5134]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:33.545611 systemd[1]: sshd@9-145.40.67.79:22-139.178.68.195:36800.service: Deactivated successfully. Feb 13 08:23:33.546201 systemd-logind[1548]: Session 12 logged out. Waiting for processes to exit. Feb 13 08:23:33.546222 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 08:23:33.546905 systemd-logind[1548]: Removed session 12. 
Feb 13 08:23:33.461000 audit[5134]: CRED_ACQ pid=5134 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.641278 kernel: audit: type=1101 audit(1707812613.458:300): pid=5134 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.641342 kernel: audit: type=1103 audit(1707812613.461:301): pid=5134 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.641358 kernel: audit: type=1006 audit(1707812613.461:302): pid=5134 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Feb 13 08:23:33.655898 env[1563]: time="2024-02-13T08:23:33.655877356Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:23:33.656104 env[1563]: time="2024-02-13T08:23:33.655875730Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:23:33.667832 env[1563]: time="2024-02-13T08:23:33.667774598Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:33.667947 kubelet[2738]: E0213 08:23:33.667937 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:23:33.668116 kubelet[2738]: E0213 08:23:33.667962 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:23:33.668116 kubelet[2738]: E0213 08:23:33.667983 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:33.668116 kubelet[2738]: E0213 08:23:33.668007 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to 
destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:23:33.671936 env[1563]: time="2024-02-13T08:23:33.671885442Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:33.672101 kubelet[2738]: E0213 08:23:33.672063 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:23:33.672101 kubelet[2738]: E0213 08:23:33.672078 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:23:33.672101 kubelet[2738]: E0213 08:23:33.672096 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:33.672205 kubelet[2738]: E0213 08:23:33.672115 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:23:33.699846 kernel: audit: type=1300 audit(1707812613.461:302): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4900b520 a2=3 a3=0 items=0 ppid=1 pid=5134 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:33.461000 audit[5134]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4900b520 a2=3 a3=0 items=0 ppid=1 pid=5134 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:33.791826 kernel: audit: type=1327 audit(1707812613.461:302): 
proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:33.461000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:33.822245 kernel: audit: type=1105 audit(1707812613.466:303): pid=5134 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.466000 audit[5134]: USER_START pid=5134 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.467000 audit[5137]: CRED_ACQ pid=5137 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:34.005887 kernel: audit: type=1103 audit(1707812613.467:304): pid=5137 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:34.005943 kernel: audit: type=1106 audit(1707812613.543:305): pid=5134 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.543000 audit[5134]: USER_END pid=5134 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.543000 audit[5134]: CRED_DISP pid=5134 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:34.190554 kernel: audit: type=1104 audit(1707812613.543:306): pid=5134 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:33.544000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.79:22-139.178.68.195:36800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:38.550209 systemd[1]: Started sshd@10-145.40.67.79:22-139.178.68.195:36238.service. Feb 13 08:23:38.549000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.67.79:22-139.178.68.195:36238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:23:38.576650 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:38.576707 kernel: audit: type=1130 audit(1707812618.549:308): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.67.79:22-139.178.68.195:36238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:38.684000 audit[5220]: USER_ACCT pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.685821 sshd[5220]: Accepted publickey for core from 139.178.68.195 port 36238 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:38.689013 sshd[5220]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:38.693744 systemd-logind[1548]: New session 13 of user core. Feb 13 08:23:38.694191 systemd[1]: Started session-13.scope. Feb 13 08:23:38.687000 audit[5220]: CRED_ACQ pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.780588 sshd[5220]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:38.781998 systemd[1]: sshd@10-145.40.67.79:22-139.178.68.195:36238.service: Deactivated successfully. Feb 13 08:23:38.782645 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 08:23:38.782665 systemd-logind[1548]: Session 13 logged out. Waiting for processes to exit. Feb 13 08:23:38.783441 systemd-logind[1548]: Removed session 13. 
Feb 13 08:23:38.867309 kernel: audit: type=1101 audit(1707812618.684:309): pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.867345 kernel: audit: type=1103 audit(1707812618.687:310): pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.867360 kernel: audit: type=1006 audit(1707812618.687:311): pid=5220 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Feb 13 08:23:38.925753 kernel: audit: type=1300 audit(1707812618.687:311): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd924a2bc0 a2=3 a3=0 items=0 ppid=1 pid=5220 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:38.687000 audit[5220]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd924a2bc0 a2=3 a3=0 items=0 ppid=1 pid=5220 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:39.017691 kernel: audit: type=1327 audit(1707812618.687:311): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:38.687000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:39.048109 kernel: audit: type=1105 audit(1707812618.695:312): pid=5220 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.695000 audit[5220]: USER_START pid=5220 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.696000 audit[5223]: CRED_ACQ pid=5223 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:39.231587 kernel: audit: type=1103 audit(1707812618.696:313): pid=5223 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:39.231617 kernel: audit: type=1106 audit(1707812618.780:314): pid=5220 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.780000 audit[5220]: USER_END pid=5220 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:39.326974 kernel: audit: type=1104 audit(1707812618.780:315): pid=5220 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.780000 audit[5220]: CRED_DISP pid=5220 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:38.781000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.67.79:22-139.178.68.195:36238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:40.657755 env[1563]: time="2024-02-13T08:23:40.657662878Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:23:40.683462 env[1563]: time="2024-02-13T08:23:40.683399109Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:40.683641 kubelet[2738]: E0213 08:23:40.683599 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:23:40.683808 kubelet[2738]: E0213 08:23:40.683654 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:23:40.683808 kubelet[2738]: E0213 08:23:40.683674 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:40.683808 kubelet[2738]: E0213 08:23:40.683690 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:23:41.656751 env[1563]: time="2024-02-13T08:23:41.656685035Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:23:41.673340 env[1563]: time="2024-02-13T08:23:41.673275181Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:41.673607 kubelet[2738]: E0213 08:23:41.673467 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:23:41.673607 kubelet[2738]: E0213 08:23:41.673497 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:23:41.673607 kubelet[2738]: E0213 08:23:41.673518 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:41.673607 kubelet[2738]: E0213 08:23:41.673535 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:23:43.787641 systemd[1]: Started sshd@11-145.40.67.79:22-139.178.68.195:36244.service. Feb 13 08:23:43.786000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.67.79:22-139.178.68.195:36244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:43.814563 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:43.814624 kernel: audit: type=1130 audit(1707812623.786:317): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.67.79:22-139.178.68.195:36244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:23:43.923000 audit[5305]: USER_ACCT pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:43.924834 sshd[5305]: Accepted publickey for core from 139.178.68.195 port 36244 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:43.925932 sshd[5305]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:43.928321 systemd-logind[1548]: New session 14 of user core. Feb 13 08:23:43.928732 systemd[1]: Started session-14.scope. Feb 13 08:23:44.010322 sshd[5305]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:44.011784 systemd[1]: sshd@11-145.40.67.79:22-139.178.68.195:36244.service: Deactivated successfully. Feb 13 08:23:44.012376 systemd-logind[1548]: Session 14 logged out. Waiting for processes to exit. Feb 13 08:23:44.012380 systemd[1]: session-14.scope: Deactivated successfully. Feb 13 08:23:44.012829 systemd-logind[1548]: Removed session 14. Feb 13 08:23:43.924000 audit[5305]: CRED_ACQ pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.108350 kernel: audit: type=1101 audit(1707812623.923:318): pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.108387 kernel: audit: type=1103 audit(1707812623.924:319): pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.108406 kernel: audit: type=1006 audit(1707812623.924:320): pid=5305 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Feb 13 08:23:44.166795 kernel: audit: type=1300 audit(1707812623.924:320): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4fdc2710 a2=3 a3=0 items=0 ppid=1 pid=5305 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:43.924000 audit[5305]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4fdc2710 a2=3 a3=0 items=0 ppid=1 pid=5305 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:44.258682 kernel: audit: type=1327 audit(1707812623.924:320): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:43.924000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:44.289096 kernel: audit: type=1105 audit(1707812623.929:321): pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:43.929000 audit[5305]: 
USER_START pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.383392 kernel: audit: type=1103 audit(1707812623.930:322): pid=5308 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:43.930000 audit[5308]: CRED_ACQ pid=5308 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.472429 kernel: audit: type=1106 audit(1707812624.009:323): pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.009000 audit[5305]: USER_END pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.567764 kernel: audit: type=1104 audit(1707812624.010:324): pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.010000 audit[5305]: CRED_DISP pid=5305 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:44.010000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.67.79:22-139.178.68.195:36244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:23:45.656921 env[1563]: time="2024-02-13T08:23:45.656837922Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:23:45.705832 env[1563]: time="2024-02-13T08:23:45.705766536Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:45.706081 kubelet[2738]: E0213 08:23:45.706046 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:23:45.706505 kubelet[2738]: E0213 08:23:45.706107 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:23:45.706505 kubelet[2738]: E0213 08:23:45.706159 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:45.706505 kubelet[2738]: E0213 08:23:45.706202 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:23:46.656669 env[1563]: time="2024-02-13T08:23:46.656607187Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:23:46.673191 env[1563]: time="2024-02-13T08:23:46.673158646Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:46.673412 kubelet[2738]: E0213 08:23:46.673328 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:23:46.673412 kubelet[2738]: E0213 08:23:46.673353 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:23:46.673412 kubelet[2738]: E0213 08:23:46.673379 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:46.673412 kubelet[2738]: E0213 08:23:46.673396 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:23:49.016173 systemd[1]: Started sshd@12-145.40.67.79:22-139.178.68.195:56864.service. Feb 13 08:23:49.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.67.79:22-139.178.68.195:56864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:49.042958 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:49.043036 kernel: audit: type=1130 audit(1707812629.015:326): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.67.79:22-139.178.68.195:56864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:49.151000 audit[5390]: USER_ACCT pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.152997 sshd[5390]: Accepted publickey for core from 139.178.68.195 port 56864 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:49.154284 sshd[5390]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:49.156693 systemd-logind[1548]: New session 15 of user core. Feb 13 08:23:49.157126 systemd[1]: Started session-15.scope. Feb 13 08:23:49.238874 sshd[5390]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:49.240543 systemd[1]: sshd@12-145.40.67.79:22-139.178.68.195:56864.service: Deactivated successfully. Feb 13 08:23:49.241193 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 08:23:49.241255 systemd-logind[1548]: Session 15 logged out. 
Waiting for processes to exit. Feb 13 08:23:49.241906 systemd-logind[1548]: Removed session 15. Feb 13 08:23:49.153000 audit[5390]: CRED_ACQ pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.335394 kernel: audit: type=1101 audit(1707812629.151:327): pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.335448 kernel: audit: type=1103 audit(1707812629.153:328): pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.335467 kernel: audit: type=1006 audit(1707812629.153:329): pid=5390 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Feb 13 08:23:49.393851 kernel: audit: type=1300 audit(1707812629.153:329): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef7a06960 a2=3 a3=0 items=0 ppid=1 pid=5390 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:49.153000 audit[5390]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef7a06960 a2=3 a3=0 items=0 ppid=1 pid=5390 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:49.485750 kernel: audit: type=1327 audit(1707812629.153:329): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:49.153000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:49.516155 kernel: audit: type=1105 audit(1707812629.159:330): pid=5390 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.159000 audit[5390]: USER_START pid=5390 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.610452 kernel: audit: type=1103 audit(1707812629.159:331): pid=5393 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.159000 audit[5393]: CRED_ACQ pid=5393 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.699585 kernel: audit: type=1106 audit(1707812629.238:332): pid=5390 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.238000 audit[5390]: USER_END pid=5390 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.794974 kernel: audit: type=1104 audit(1707812629.238:333): pid=5390 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.238000 audit[5390]: CRED_DISP pid=5390 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:49.239000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.67.79:22-139.178.68.195:56864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:54.247276 systemd[1]: Started sshd@13-145.40.67.79:22-139.178.68.195:56870.service. Feb 13 08:23:54.246000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.79:22-139.178.68.195:56870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:54.274811 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:54.274892 kernel: audit: type=1130 audit(1707812634.246:335): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.79:22-139.178.68.195:56870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:54.383000 audit[5418]: USER_ACCT pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.384824 sshd[5418]: Accepted publickey for core from 139.178.68.195 port 56870 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:54.386328 sshd[5418]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:54.388949 systemd-logind[1548]: New session 16 of user core. Feb 13 08:23:54.389472 systemd[1]: Started session-16.scope. Feb 13 08:23:54.473644 sshd[5418]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:54.475296 systemd[1]: sshd@13-145.40.67.79:22-139.178.68.195:56870.service: Deactivated successfully. Feb 13 08:23:54.385000 audit[5418]: CRED_ACQ pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.475937 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 08:23:54.475957 systemd-logind[1548]: Session 16 logged out. Waiting for processes to exit. 
Feb 13 08:23:54.476462 systemd-logind[1548]: Removed session 16. Feb 13 08:23:54.566458 kernel: audit: type=1101 audit(1707812634.383:336): pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.566500 kernel: audit: type=1103 audit(1707812634.385:337): pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.566516 kernel: audit: type=1006 audit(1707812634.385:338): pid=5418 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Feb 13 08:23:54.624966 kernel: audit: type=1300 audit(1707812634.385:338): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff01703ef0 a2=3 a3=0 items=0 ppid=1 pid=5418 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:54.385000 audit[5418]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff01703ef0 a2=3 a3=0 items=0 ppid=1 pid=5418 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:54.716855 kernel: audit: type=1327 audit(1707812634.385:338): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:54.385000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:54.747359 kernel: audit: type=1105 audit(1707812634.391:339): pid=5418 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.391000 audit[5418]: USER_START pid=5418 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.841741 kernel: audit: type=1103 audit(1707812634.392:340): pid=5421 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.392000 audit[5421]: CRED_ACQ pid=5421 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.930845 kernel: audit: type=1106 audit(1707812634.473:341): pid=5418 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.473000 audit[5418]: USER_END pid=5418 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:55.026331 kernel: audit: type=1104 audit(1707812634.473:342): pid=5418 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.473000 audit[5418]: CRED_DISP pid=5418 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:54.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.79:22-139.178.68.195:56870 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:55.656468 env[1563]: time="2024-02-13T08:23:55.656426972Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:23:55.656468 env[1563]: time="2024-02-13T08:23:55.656446812Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:23:55.673278 env[1563]: time="2024-02-13T08:23:55.673214335Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:55.673278 env[1563]: time="2024-02-13T08:23:55.673267123Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:55.673440 kubelet[2738]: E0213 08:23:55.673396 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:23:55.673440 kubelet[2738]: E0213 08:23:55.673423 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:23:55.673693 kubelet[2738]: E0213 08:23:55.673446 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:55.673693 kubelet[2738]: E0213 08:23:55.673396 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:23:55.673693 kubelet[2738]: E0213 08:23:55.673467 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:23:55.673693 kubelet[2738]: E0213 08:23:55.673486 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:23:55.673819 kubelet[2738]: E0213 08:23:55.673520 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:55.673819 kubelet[2738]: E0213 08:23:55.673547 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:23:57.656458 env[1563]: time="2024-02-13T08:23:57.656417196Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:23:57.656885 env[1563]: time="2024-02-13T08:23:57.656627110Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:23:57.673277 env[1563]: time="2024-02-13T08:23:57.673239378Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Feb 13 08:23:57.673382 env[1563]: time="2024-02-13T08:23:57.673309241Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:23:57.673410 kubelet[2738]: E0213 08:23:57.673393 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:23:57.673410 kubelet[2738]: E0213 08:23:57.673405 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:23:57.673626 kubelet[2738]: E0213 08:23:57.673421 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:23:57.673626 kubelet[2738]: E0213 08:23:57.673421 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:23:57.673626 kubelet[2738]: E0213 08:23:57.673444 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:57.673626 kubelet[2738]: E0213 08:23:57.673444 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:23:57.673626 kubelet[2738]: E0213 08:23:57.673462 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:23:57.673771 kubelet[2738]: E0213 08:23:57.673464 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:23:59.479963 systemd[1]: Started sshd@14-145.40.67.79:22-139.178.68.195:35362.service. Feb 13 08:23:59.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.67.79:22-139.178.68.195:35362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:59.507033 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:23:59.507085 kernel: audit: type=1130 audit(1707812639.479:344): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.67.79:22-139.178.68.195:35362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:59.616000 audit[5561]: USER_ACCT pid=5561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.617615 sshd[5561]: Accepted publickey for core from 139.178.68.195 port 35362 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:59.619766 sshd[5561]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:59.624429 systemd-logind[1548]: New session 17 of user core. Feb 13 08:23:59.624881 systemd[1]: Started session-17.scope. Feb 13 08:23:59.708458 sshd[5561]: pam_unix(sshd:session): session closed for user core Feb 13 08:23:59.710098 systemd[1]: Started sshd@15-145.40.67.79:22-139.178.68.195:35372.service. Feb 13 08:23:59.618000 audit[5561]: CRED_ACQ pid=5561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.710718 systemd[1]: sshd@14-145.40.67.79:22-139.178.68.195:35362.service: Deactivated successfully. Feb 13 08:23:59.711247 systemd[1]: session-17.scope: Deactivated successfully. Feb 13 08:23:59.711264 systemd-logind[1548]: Session 17 logged out. Waiting for processes to exit. Feb 13 08:23:59.711739 systemd-logind[1548]: Removed session 17. 
Feb 13 08:23:59.801042 kernel: audit: type=1101 audit(1707812639.616:345): pid=5561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.801084 kernel: audit: type=1103 audit(1707812639.618:346): pid=5561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.801100 kernel: audit: type=1006 audit(1707812639.618:347): pid=5561 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Feb 13 08:23:59.859576 kernel: audit: type=1300 audit(1707812639.618:347): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffeffdbb10 a2=3 a3=0 items=0 ppid=1 pid=5561 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:59.618000 audit[5561]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffeffdbb10 a2=3 a3=0 items=0 ppid=1 pid=5561 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:59.879909 sshd[5586]: Accepted publickey for core from 139.178.68.195 port 35372 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:23:59.881286 sshd[5586]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:23:59.883590 systemd-logind[1548]: New session 18 of user core. Feb 13 08:23:59.884049 systemd[1]: Started session-18.scope. 
Feb 13 08:23:59.951444 kernel: audit: type=1327 audit(1707812639.618:347): proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:59.618000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:59.981863 kernel: audit: type=1105 audit(1707812639.627:348): pid=5561 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.627000 audit[5561]: USER_START pid=5561 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.076258 kernel: audit: type=1103 audit(1707812639.627:349): pid=5564 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.627000 audit[5564]: CRED_ACQ pid=5564 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.165330 kernel: audit: type=1106 audit(1707812639.708:350): pid=5561 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.708000 audit[5561]: USER_END pid=5561 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.260744 kernel: audit: type=1104 audit(1707812639.709:351): pid=5561 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.709000 audit[5561]: CRED_DISP pid=5561 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.709000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.67.79:22-139.178.68.195:35372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:23:59.709000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.67.79:22-139.178.68.195:35362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:23:59.878000 audit[5586]: USER_ACCT pid=5586 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.880000 audit[5586]: CRED_ACQ pid=5586 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.880000 audit[5586]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe956dd9e0 a2=3 a3=0 items=0 ppid=1 pid=5586 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:23:59.880000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:23:59.886000 audit[5586]: USER_START pid=5586 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:23:59.886000 audit[5592]: CRED_ACQ pid=5592 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.416418 sshd[5586]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:00.415000 audit[5586]: USER_END pid=5586 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.416000 audit[5586]: CRED_DISP pid=5586 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.418001 systemd[1]: Started sshd@16-145.40.67.79:22-139.178.68.195:35386.service. Feb 13 08:24:00.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.67.79:22-139.178.68.195:35386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:00.418319 systemd[1]: sshd@15-145.40.67.79:22-139.178.68.195:35372.service: Deactivated successfully. Feb 13 08:24:00.417000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.67.79:22-139.178.68.195:35372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:00.418860 systemd-logind[1548]: Session 18 logged out. Waiting for processes to exit. Feb 13 08:24:00.418892 systemd[1]: session-18.scope: Deactivated successfully. Feb 13 08:24:00.419369 systemd-logind[1548]: Removed session 18. 
Feb 13 08:24:00.443000 audit[5613]: USER_ACCT pid=5613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.444414 sshd[5613]: Accepted publickey for core from 139.178.68.195 port 35386 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:00.443000 audit[5613]: CRED_ACQ pid=5613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.443000 audit[5613]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd14dfcd0 a2=3 a3=0 items=0 ppid=1 pid=5613 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:00.443000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:00.445134 sshd[5613]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:00.447574 systemd-logind[1548]: New session 19 of user core. Feb 13 08:24:00.448057 systemd[1]: Started session-19.scope. Feb 13 08:24:00.449000 audit[5613]: USER_START pid=5613 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.449000 audit[5619]: CRED_ACQ pid=5619 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.575988 sshd[5613]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:00.575000 audit[5613]: USER_END pid=5613 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.575000 audit[5613]: CRED_DISP pid=5613 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:00.577677 systemd[1]: sshd@16-145.40.67.79:22-139.178.68.195:35386.service: Deactivated successfully. Feb 13 08:24:00.576000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.67.79:22-139.178.68.195:35386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:00.578416 systemd-logind[1548]: Session 19 logged out. Waiting for processes to exit. Feb 13 08:24:00.578436 systemd[1]: session-19.scope: Deactivated successfully. Feb 13 08:24:00.578986 systemd-logind[1548]: Removed session 19. Feb 13 08:24:05.584021 systemd[1]: Started sshd@17-145.40.67.79:22-139.178.68.195:35398.service. 
Feb 13 08:24:05.583000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.67.79:22-139.178.68.195:35398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:05.611384 kernel: kauditd_printk_skb: 23 callbacks suppressed Feb 13 08:24:05.611460 kernel: audit: type=1130 audit(1707812645.583:371): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.67.79:22-139.178.68.195:35398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:05.722000 audit[5642]: USER_ACCT pid=5642 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.723452 sshd[5642]: Accepted publickey for core from 139.178.68.195 port 35398 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:05.725295 sshd[5642]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:05.727838 systemd-logind[1548]: New session 20 of user core. Feb 13 08:24:05.728278 systemd[1]: Started session-20.scope. Feb 13 08:24:05.806429 sshd[5642]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:05.808057 systemd[1]: sshd@17-145.40.67.79:22-139.178.68.195:35398.service: Deactivated successfully. Feb 13 08:24:05.808697 systemd[1]: session-20.scope: Deactivated successfully. Feb 13 08:24:05.808722 systemd-logind[1548]: Session 20 logged out. Waiting for processes to exit. Feb 13 08:24:05.809397 systemd-logind[1548]: Removed session 20. 
Feb 13 08:24:05.724000 audit[5642]: CRED_ACQ pid=5642 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.904975 kernel: audit: type=1101 audit(1707812645.722:372): pid=5642 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.905065 kernel: audit: type=1103 audit(1707812645.724:373): pid=5642 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.905081 kernel: audit: type=1006 audit(1707812645.724:374): pid=5642 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Feb 13 08:24:05.963506 kernel: audit: type=1300 audit(1707812645.724:374): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd284cc5e0 a2=3 a3=0 items=0 ppid=1 pid=5642 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:05.724000 audit[5642]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd284cc5e0 a2=3 a3=0 items=0 ppid=1 pid=5642 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:06.055512 kernel: audit: type=1327 audit(1707812645.724:374): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:05.724000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:06.085939 kernel: audit: type=1105 audit(1707812645.730:375): pid=5642 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.730000 audit[5642]: USER_START pid=5642 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:06.180270 kernel: audit: type=1103 audit(1707812645.731:376): pid=5645 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.731000 audit[5645]: CRED_ACQ pid=5645 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:06.269343 kernel: audit: type=1106 audit(1707812645.806:377): pid=5642 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.806000 audit[5642]: USER_END pid=5642 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.806000 audit[5642]: CRED_DISP pid=5642 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:06.453832 kernel: audit: type=1104 audit(1707812645.806:378): pid=5642 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:05.807000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.67.79:22-139.178.68.195:35398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:06.657378 env[1563]: time="2024-02-13T08:24:06.657289046Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:24:06.708075 env[1563]: time="2024-02-13T08:24:06.707961564Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:06.708319 kubelet[2738]: E0213 08:24:06.708262 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:24:06.708319 kubelet[2738]: E0213 08:24:06.708313 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:24:06.708776 kubelet[2738]: E0213 08:24:06.708366 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:06.708776 kubelet[2738]: E0213 08:24:06.708409 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:24:10.658153 env[1563]: time="2024-02-13T08:24:10.658019086Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:24:10.684794 env[1563]: time="2024-02-13T08:24:10.684740966Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:10.684902 kubelet[2738]: E0213 08:24:10.684840 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:24:10.684902 kubelet[2738]: E0213 08:24:10.684883 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:24:10.685124 kubelet[2738]: E0213 08:24:10.684904 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:10.685124 kubelet[2738]: E0213 08:24:10.684923 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:24:10.813776 systemd[1]: Started sshd@18-145.40.67.79:22-139.178.68.195:38808.service. Feb 13 08:24:10.812000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.67.79:22-139.178.68.195:38808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:10.840229 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:24:10.840270 kernel: audit: type=1130 audit(1707812650.812:380): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.67.79:22-139.178.68.195:38808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:10.949000 audit[5727]: USER_ACCT pid=5727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:10.950930 sshd[5727]: Accepted publickey for core from 139.178.68.195 port 38808 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:10.952319 sshd[5727]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:10.954954 systemd-logind[1548]: New session 21 of user core. Feb 13 08:24:10.955434 systemd[1]: Started session-21.scope. Feb 13 08:24:11.034060 sshd[5727]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:11.035695 systemd[1]: sshd@18-145.40.67.79:22-139.178.68.195:38808.service: Deactivated successfully. Feb 13 08:24:11.036344 systemd[1]: session-21.scope: Deactivated successfully. Feb 13 08:24:11.036378 systemd-logind[1548]: Session 21 logged out. Waiting for processes to exit. Feb 13 08:24:11.037074 systemd-logind[1548]: Removed session 21. Feb 13 08:24:10.951000 audit[5727]: CRED_ACQ pid=5727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.044071 kernel: audit: type=1101 audit(1707812650.949:381): pid=5727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.044103 kernel: audit: type=1103 audit(1707812650.951:382): pid=5727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.192023 kernel: audit: type=1006 audit(1707812650.951:383): pid=5727 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Feb 13 08:24:11.192054 kernel: audit: type=1300 audit(1707812650.951:383): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff009c64f0 a2=3 a3=0 items=0 ppid=1 pid=5727 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:10.951000 audit[5727]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff009c64f0 a2=3 a3=0 items=0 ppid=1 pid=5727 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:11.283912 kernel: audit: type=1327 audit(1707812650.951:383): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:10.951000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:11.314337 kernel: audit: 
type=1105 audit(1707812650.957:384): pid=5727 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:10.957000 audit[5727]: USER_START pid=5727 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.408682 kernel: audit: type=1103 audit(1707812650.958:385): pid=5730 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:10.958000 audit[5730]: CRED_ACQ pid=5730 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.497719 kernel: audit: type=1106 audit(1707812651.033:386): pid=5727 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.033000 audit[5727]: USER_END pid=5727 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.593076 kernel: audit: type=1104 audit(1707812651.034:387): pid=5727 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.034000 audit[5727]: CRED_DISP pid=5727 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:11.656026 env[1563]: time="2024-02-13T08:24:11.656001991Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:24:11.656026 env[1563]: time="2024-02-13T08:24:11.656016446Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:24:11.667711 env[1563]: time="2024-02-13T08:24:11.667645001Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:11.667937 env[1563]: time="2024-02-13T08:24:11.667859189Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" 
failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:11.667961 kubelet[2738]: E0213 08:24:11.667787 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:24:11.667961 kubelet[2738]: E0213 08:24:11.667817 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:24:11.667961 kubelet[2738]: E0213 08:24:11.667840 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:11.667961 kubelet[2738]: E0213 08:24:11.667862 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:24:11.668089 kubelet[2738]: E0213 08:24:11.667924 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:24:11.668089 kubelet[2738]: E0213 08:24:11.667933 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:24:11.668089 kubelet[2738]: E0213 08:24:11.667951 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" Feb 13 08:24:11.668089 kubelet[2738]: E0213 08:24:11.667964 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:24:11.034000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.67.79:22-139.178.68.195:38808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:12.944779 systemd[1]: Started sshd@19-145.40.67.79:22-221.192.237.58:28867.service. Feb 13 08:24:12.943000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.67.79:22-221.192.237.58:28867 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:16.042538 systemd[1]: Started sshd@20-145.40.67.79:22-139.178.68.195:47034.service. Feb 13 08:24:16.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.67.79:22-139.178.68.195:47034 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:16.069918 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:24:16.069978 kernel: audit: type=1130 audit(1707812656.041:390): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.67.79:22-139.178.68.195:47034 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:16.177000 audit[5810]: USER_ACCT pid=5810 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.178864 sshd[5810]: Accepted publickey for core from 139.178.68.195 port 47034 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:16.180293 sshd[5810]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:16.182639 systemd-logind[1548]: New session 22 of user core. Feb 13 08:24:16.183048 systemd[1]: Started session-22.scope. Feb 13 08:24:16.261322 sshd[5810]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:16.262577 systemd[1]: sshd@20-145.40.67.79:22-139.178.68.195:47034.service: Deactivated successfully. Feb 13 08:24:16.263220 systemd[1]: session-22.scope: Deactivated successfully. Feb 13 08:24:16.263263 systemd-logind[1548]: Session 22 logged out. Waiting for processes to exit. Feb 13 08:24:16.263757 systemd-logind[1548]: Removed session 22. 
Feb 13 08:24:16.179000 audit[5810]: CRED_ACQ pid=5810 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.360392 kernel: audit: type=1101 audit(1707812656.177:391): pid=5810 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.360421 kernel: audit: type=1103 audit(1707812656.179:392): pid=5810 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.360441 kernel: audit: type=1006 audit(1707812656.179:393): pid=5810 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Feb 13 08:24:16.418842 kernel: audit: type=1300 audit(1707812656.179:393): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3648fe50 a2=3 a3=0 items=0 ppid=1 pid=5810 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:16.179000 audit[5810]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3648fe50 a2=3 a3=0 items=0 ppid=1 pid=5810 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:16.510722 kernel: audit: type=1327 audit(1707812656.179:393): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:16.179000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:16.184000 audit[5810]: USER_START pid=5810 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.635448 kernel: audit: type=1105 audit(1707812656.184:394): pid=5810 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.635476 kernel: audit: type=1103 audit(1707812656.184:395): pid=5813 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.184000 audit[5813]: CRED_ACQ pid=5813 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.724448 kernel: audit: type=1106 audit(1707812656.260:396): pid=5810 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.260000 audit[5810]: USER_END pid=5810 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.819800 kernel: audit: type=1104 audit(1707812656.260:397): pid=5810 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.260000 audit[5810]: CRED_DISP pid=5810 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:16.261000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.67.79:22-139.178.68.195:47034 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:18.656730 env[1563]: time="2024-02-13T08:24:18.656693605Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:24:18.672911 env[1563]: time="2024-02-13T08:24:18.672849466Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:18.673106 kubelet[2738]: E0213 08:24:18.673038 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:24:18.673106 kubelet[2738]: E0213 08:24:18.673091 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:24:18.673344 kubelet[2738]: E0213 08:24:18.673112 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:18.673344 kubelet[2738]: E0213 08:24:18.673149 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:24:21.268570 systemd[1]: Started sshd@21-145.40.67.79:22-139.178.68.195:47044.service. Feb 13 08:24:21.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.67.79:22-139.178.68.195:47044 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:21.295472 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:24:21.295567 kernel: audit: type=1130 audit(1707812661.267:399): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.67.79:22-139.178.68.195:47044 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:21.404000 audit[5865]: USER_ACCT pid=5865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.405492 sshd[5865]: Accepted publickey for core from 139.178.68.195 port 47044 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:21.406293 sshd[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:21.408615 systemd-logind[1548]: New session 23 of user core. Feb 13 08:24:21.409102 systemd[1]: Started session-23.scope. Feb 13 08:24:21.487814 sshd[5865]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:21.489147 systemd[1]: sshd@21-145.40.67.79:22-139.178.68.195:47044.service: Deactivated successfully. Feb 13 08:24:21.489762 systemd-logind[1548]: Session 23 logged out. Waiting for processes to exit. Feb 13 08:24:21.489770 systemd[1]: session-23.scope: Deactivated successfully. Feb 13 08:24:21.490287 systemd-logind[1548]: Removed session 23. 
Feb 13 08:24:21.405000 audit[5865]: CRED_ACQ pid=5865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.586992 kernel: audit: type=1101 audit(1707812661.404:400): pid=5865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.587027 kernel: audit: type=1103 audit(1707812661.405:401): pid=5865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.587047 kernel: audit: type=1006 audit(1707812661.405:402): pid=5865 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Feb 13 08:24:21.645403 kernel: audit: type=1300 audit(1707812661.405:402): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc42032990 a2=3 a3=0 items=0 ppid=1 pid=5865 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:21.405000 audit[5865]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc42032990 a2=3 a3=0 items=0 ppid=1 pid=5865 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:21.737276 kernel: audit: type=1327 audit(1707812661.405:402): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:21.405000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:21.767669 kernel: audit: type=1105 audit(1707812661.410:403): pid=5865 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.410000 audit[5865]: USER_START pid=5865 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.861984 kernel: audit: type=1103 audit(1707812661.411:404): pid=5868 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.411000 audit[5868]: CRED_ACQ pid=5868 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.950981 kernel: audit: type=1106 audit(1707812661.487:405): pid=5865 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.487000 audit[5865]: USER_END pid=5865 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:22.046498 kernel: audit: type=1104 audit(1707812661.487:406): pid=5865 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.487000 audit[5865]: CRED_DISP pid=5865 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:21.488000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.67.79:22-139.178.68.195:47044 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:25.656438 env[1563]: time="2024-02-13T08:24:25.656378765Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:24:25.672533 env[1563]: time="2024-02-13T08:24:25.672471703Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:25.672677 kubelet[2738]: E0213 08:24:25.672637 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:24:25.672677 kubelet[2738]: E0213 08:24:25.672663 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:24:25.672880 kubelet[2738]: E0213 08:24:25.672687 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:25.672880 kubelet[2738]: E0213 08:24:25.672704 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:24:26.493506 systemd[1]: Started sshd@22-145.40.67.79:22-139.178.68.195:57630.service. Feb 13 08:24:26.492000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.67.79:22-139.178.68.195:57630 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:26.520319 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:24:26.520388 kernel: audit: type=1130 audit(1707812666.492:408): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.67.79:22-139.178.68.195:57630 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:26.630181 sshd[5921]: Accepted publickey for core from 139.178.68.195 port 57630 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:26.629000 audit[5921]: USER_ACCT pid=5921 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.631325 sshd[5921]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:26.633687 systemd-logind[1548]: New session 24 of user core. Feb 13 08:24:26.634158 systemd[1]: Started session-24.scope. 
Feb 13 08:24:26.656541 env[1563]: time="2024-02-13T08:24:26.656522206Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:24:26.656884 env[1563]: time="2024-02-13T08:24:26.656551226Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:24:26.669123 env[1563]: time="2024-02-13T08:24:26.669082758Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:26.669245 env[1563]: time="2024-02-13T08:24:26.669088653Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:26.669329 kubelet[2738]: E0213 08:24:26.669288 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:24:26.669329 kubelet[2738]: E0213 08:24:26.669316 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:24:26.669435 kubelet[2738]: E0213 08:24:26.669337 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:26.669435 kubelet[2738]: E0213 08:24:26.669353 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:24:26.669435 kubelet[2738]: E0213 08:24:26.669289 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:24:26.669435 kubelet[2738]: E0213 08:24:26.669370 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:24:26.669657 kubelet[2738]: E0213 08:24:26.669388 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:26.669657 kubelet[2738]: E0213 08:24:26.669403 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:24:26.714978 sshd[5921]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:26.716394 systemd[1]: sshd@22-145.40.67.79:22-139.178.68.195:57630.service: Deactivated successfully. Feb 13 08:24:26.716922 systemd-logind[1548]: Session 24 logged out. Waiting for processes to exit. Feb 13 08:24:26.716959 systemd[1]: session-24.scope: Deactivated successfully. Feb 13 08:24:26.717524 systemd-logind[1548]: Removed session 24. 
Feb 13 08:24:26.630000 audit[5921]: CRED_ACQ pid=5921 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.813077 kernel: audit: type=1101 audit(1707812666.629:409): pid=5921 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.813164 kernel: audit: type=1103 audit(1707812666.630:410): pid=5921 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.813205 kernel: audit: type=1006 audit(1707812666.630:411): pid=5921 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Feb 13 08:24:26.630000 audit[5921]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe7be49e0 a2=3 a3=0 items=0 ppid=1 pid=5921 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:26.963362 kernel: audit: type=1300 audit(1707812666.630:411): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe7be49e0 a2=3 a3=0 items=0 ppid=1 pid=5921 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:26.963409 kernel: audit: type=1327 audit(1707812666.630:411): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:26.630000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:26.635000 audit[5921]: USER_START pid=5921 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:27.088031 kernel: audit: type=1105 audit(1707812666.635:412): pid=5921 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:27.088086 kernel: audit: type=1103 audit(1707812666.635:413): pid=5924 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.635000 audit[5924]: CRED_ACQ pid=5924 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.714000 audit[5921]: USER_END pid=5921 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:27.272462 kernel: audit: type=1106 audit(1707812666.714:414): pid=5921 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:27.272510 kernel: audit: type=1104 audit(1707812666.714:415): pid=5921 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.714000 audit[5921]: CRED_DISP pid=5921 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:26.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.67.79:22-139.178.68.195:57630 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:27.888487 sshd[5809]: kex_exchange_identification: banner line contains invalid characters Feb 13 08:24:27.889221 sshd[5809]: banner exchange: Connection from 221.192.237.58 port 28867: invalid format Feb 13 08:24:27.889980 systemd[1]: sshd@19-145.40.67.79:22-221.192.237.58:28867.service: Deactivated successfully. Feb 13 08:24:27.889000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.67.79:22-221.192.237.58:28867 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:31.655772 env[1563]: time="2024-02-13T08:24:31.655713447Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:24:31.670895 env[1563]: time="2024-02-13T08:24:31.670851797Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:31.671064 kubelet[2738]: E0213 08:24:31.671051 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:24:31.671282 kubelet[2738]: E0213 08:24:31.671080 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:24:31.671282 kubelet[2738]: E0213 08:24:31.671109 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:31.671282 kubelet[2738]: E0213 08:24:31.671131 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:24:31.722608 systemd[1]: Started sshd@23-145.40.67.79:22-139.178.68.195:57644.service. Feb 13 08:24:31.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.67.79:22-139.178.68.195:57644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:31.749530 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:24:31.749583 kernel: audit: type=1130 audit(1707812671.722:418): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.67.79:22-139.178.68.195:57644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:31.857000 audit[6041]: USER_ACCT pid=6041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.858429 sshd[6041]: Accepted publickey for core from 139.178.68.195 port 57644 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:31.859265 sshd[6041]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:31.861653 systemd-logind[1548]: New session 25 of user core. Feb 13 08:24:31.862085 systemd[1]: Started session-25.scope. Feb 13 08:24:31.947746 sshd[6041]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:31.949223 systemd[1]: sshd@23-145.40.67.79:22-139.178.68.195:57644.service: Deactivated successfully. Feb 13 08:24:31.949834 systemd-logind[1548]: Session 25 logged out. Waiting for processes to exit. Feb 13 08:24:31.949872 systemd[1]: session-25.scope: Deactivated successfully. Feb 13 08:24:31.858000 audit[6041]: CRED_ACQ pid=6041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.950454 systemd-logind[1548]: Removed session 25. Feb 13 08:24:32.040622 kernel: audit: type=1101 audit(1707812671.857:419): pid=6041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:32.040715 kernel: audit: type=1103 audit(1707812671.858:420): pid=6041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:32.040744 kernel: audit: type=1006 audit(1707812671.858:421): pid=6041 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Feb 13 08:24:31.858000 audit[6041]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6fbcc5d0 a2=3 a3=0 items=0 ppid=1 pid=6041 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:32.190954 kernel: audit: type=1300 audit(1707812671.858:421): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6fbcc5d0 a2=3 a3=0 items=0 ppid=1 pid=6041 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:32.190988 kernel: audit: type=1327 audit(1707812671.858:421): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:31.858000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:32.221349 kernel: audit: type=1105 audit(1707812671.863:422): pid=6041 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.863000 audit[6041]: 
USER_START pid=6041 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:32.315827 kernel: audit: type=1103 audit(1707812671.863:423): pid=6044 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.863000 audit[6044]: CRED_ACQ pid=6044 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.947000 audit[6041]: USER_END pid=6041 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:32.500238 kernel: audit: type=1106 audit(1707812671.947:424): pid=6041 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:32.500294 kernel: audit: type=1104 audit(1707812671.947:425): pid=6041 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.947000 audit[6041]: CRED_DISP pid=6041 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:31.948000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.67.79:22-139.178.68.195:57644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:36.657201 env[1563]: time="2024-02-13T08:24:36.657041301Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:24:36.696501 env[1563]: time="2024-02-13T08:24:36.696410891Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:36.696672 kubelet[2738]: E0213 08:24:36.696647 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:24:36.697069 kubelet[2738]: E0213 08:24:36.696695 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:24:36.697069 kubelet[2738]: E0213 08:24:36.696744 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:36.697069 kubelet[2738]: E0213 08:24:36.696786 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:24:36.954666 systemd[1]: Started sshd@24-145.40.67.79:22-139.178.68.195:46060.service. Feb 13 08:24:36.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.67.79:22-139.178.68.195:46060 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:36.996583 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:24:36.996658 kernel: audit: type=1130 audit(1707812676.954:427): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.67.79:22-139.178.68.195:46060 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:37.103000 audit[6100]: USER_ACCT pid=6100 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.105031 sshd[6100]: Accepted publickey for core from 139.178.68.195 port 46060 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:37.106308 sshd[6100]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:37.108690 systemd-logind[1548]: New session 26 of user core. Feb 13 08:24:37.109221 systemd[1]: Started session-26.scope. Feb 13 08:24:37.186134 sshd[6100]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:37.187445 systemd[1]: sshd@24-145.40.67.79:22-139.178.68.195:46060.service: Deactivated successfully. Feb 13 08:24:37.187977 systemd-logind[1548]: Session 26 logged out. Waiting for processes to exit. Feb 13 08:24:37.188019 systemd[1]: session-26.scope: Deactivated successfully. Feb 13 08:24:37.188580 systemd-logind[1548]: Removed session 26. Feb 13 08:24:37.105000 audit[6100]: CRED_ACQ pid=6100 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.286529 kernel: audit: type=1101 audit(1707812677.103:428): pid=6100 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.286561 kernel: audit: type=1103 audit(1707812677.105:429): pid=6100 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.286574 kernel: audit: type=1006 audit(1707812677.105:430): pid=6100 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Feb 13 08:24:37.345167 kernel: audit: type=1300 audit(1707812677.105:430): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde9305aa0 a2=3 a3=0 items=0 ppid=1 pid=6100 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:37.105000 audit[6100]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde9305aa0 a2=3 a3=0 items=0 ppid=1 pid=6100 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:37.437028 kernel: audit: type=1327 audit(1707812677.105:430): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:37.105000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:37.467417 kernel: audit: type=1105 audit(1707812677.110:431): pid=6100 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.110000 audit[6100]: 
USER_START pid=6100 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.110000 audit[6103]: CRED_ACQ pid=6103 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.650803 kernel: audit: type=1103 audit(1707812677.110:432): pid=6103 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.650838 kernel: audit: type=1106 audit(1707812677.185:433): pid=6100 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.185000 audit[6100]: USER_END pid=6100 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.185000 audit[6100]: CRED_DISP pid=6100 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.835362 kernel: audit: type=1104 audit(1707812677.185:434): pid=6100 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:37.186000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.67.79:22-139.178.68.195:46060 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:39.656524 env[1563]: time="2024-02-13T08:24:39.656492482Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:24:39.671527 env[1563]: time="2024-02-13T08:24:39.671496200Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:39.671679 kubelet[2738]: E0213 08:24:39.671667 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:24:39.671846 kubelet[2738]: E0213 08:24:39.671693 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:24:39.671846 kubelet[2738]: E0213 08:24:39.671716 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:39.671846 kubelet[2738]: E0213 08:24:39.671734 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:24:41.656798 env[1563]: time="2024-02-13T08:24:41.656764266Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:24:41.672854 env[1563]: time="2024-02-13T08:24:41.672795053Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:41.672943 kubelet[2738]: E0213 08:24:41.672931 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:24:41.673118 kubelet[2738]: E0213 08:24:41.672957 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:24:41.673118 kubelet[2738]: E0213 08:24:41.672979 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:41.673118 kubelet[2738]: E0213 08:24:41.673004 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:24:42.193471 systemd[1]: Started sshd@25-145.40.67.79:22-139.178.68.195:46070.service. Feb 13 08:24:42.192000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.67.79:22-139.178.68.195:46070 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:42.220818 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:24:42.220896 kernel: audit: type=1130 audit(1707812682.192:436): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.67.79:22-139.178.68.195:46070 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:42.329000 audit[6184]: USER_ACCT pid=6184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.330441 sshd[6184]: Accepted publickey for core from 139.178.68.195 port 46070 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:42.333261 sshd[6184]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:42.335482 systemd-logind[1548]: New session 27 of user core. Feb 13 08:24:42.336100 systemd[1]: Started session-27.scope. Feb 13 08:24:42.416104 sshd[6184]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:42.417412 systemd[1]: sshd@25-145.40.67.79:22-139.178.68.195:46070.service: Deactivated successfully. Feb 13 08:24:42.417948 systemd-logind[1548]: Session 27 logged out. Waiting for processes to exit. 
Feb 13 08:24:42.417996 systemd[1]: session-27.scope: Deactivated successfully. Feb 13 08:24:42.418634 systemd-logind[1548]: Removed session 27. Feb 13 08:24:42.332000 audit[6184]: CRED_ACQ pid=6184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.513642 kernel: audit: type=1101 audit(1707812682.329:437): pid=6184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.513677 kernel: audit: type=1103 audit(1707812682.332:438): pid=6184 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.513693 kernel: audit: type=1006 audit(1707812682.332:439): pid=6184 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Feb 13 08:24:42.572123 kernel: audit: type=1300 audit(1707812682.332:439): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc32b61c30 a2=3 a3=0 items=0 ppid=1 pid=6184 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:42.332000 audit[6184]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc32b61c30 a2=3 a3=0 items=0 ppid=1 pid=6184 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:42.332000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:42.694463 kernel: audit: type=1327 audit(1707812682.332:439): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:42.694499 kernel: audit: type=1105 audit(1707812682.337:440): pid=6184 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.337000 audit[6184]: USER_START pid=6184 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.788816 kernel: audit: type=1103 audit(1707812682.337:441): pid=6187 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.337000 audit[6187]: CRED_ACQ pid=6187 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.415000 audit[6184]: USER_END pid=6184 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.877999 kernel: audit: type=1106 audit(1707812682.415:442): pid=6184 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.415000 audit[6184]: CRED_DISP pid=6184 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:43.062522 kernel: audit: type=1104 audit(1707812682.415:443): pid=6184 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:42.416000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.67.79:22-139.178.68.195:46070 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:43.451765 systemd[1]: Started sshd@26-145.40.67.79:22-221.192.237.58:36022.service. Feb 13 08:24:43.451000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.67.79:22-221.192.237.58:36022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:44.657259 env[1563]: time="2024-02-13T08:24:44.657171049Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:24:44.705741 env[1563]: time="2024-02-13T08:24:44.705682613Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:44.705946 kubelet[2738]: E0213 08:24:44.705925 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:24:44.706352 kubelet[2738]: E0213 08:24:44.705974 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:24:44.706352 kubelet[2738]: E0213 08:24:44.706050 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:44.706352 kubelet[2738]: E0213 08:24:44.706106 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:24:47.419171 systemd[1]: Started sshd@27-145.40.67.79:22-139.178.68.195:49290.service. Feb 13 08:24:47.418000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.67.79:22-139.178.68.195:49290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:47.446367 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:24:47.446446 kernel: audit: type=1130 audit(1707812687.418:446): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.67.79:22-139.178.68.195:49290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:47.555000 audit[6241]: USER_ACCT pid=6241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.556848 sshd[6241]: Accepted publickey for core from 139.178.68.195 port 49290 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:47.560713 sshd[6241]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:47.564479 systemd-logind[1548]: New session 28 of user core. Feb 13 08:24:47.564902 systemd[1]: Started session-28.scope. Feb 13 08:24:47.645132 sshd[6241]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:47.646570 systemd[1]: sshd@27-145.40.67.79:22-139.178.68.195:49290.service: Deactivated successfully. Feb 13 08:24:47.647188 systemd[1]: session-28.scope: Deactivated successfully. Feb 13 08:24:47.647242 systemd-logind[1548]: Session 28 logged out. Waiting for processes to exit. Feb 13 08:24:47.647701 systemd-logind[1548]: Removed session 28. 
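Each interactive session from 139.178.68.195 leaves the same audit shape in this journal: systemd brackets the connection with SERVICE_START and SERVICE_STOP for the per-connection sshd@ unit, sshd's USER_ACCT and CRED_ACQ records arrive while ses= is still unset (4294967295), and USER_START, USER_END and CRED_DISP then carry the assigned session number (ses=23 and onward in this excerpt). A rough sketch that pairs the open and close records by that number; it assumes plain journal text like this excerpt on stdin, relies only on the audit keywords and the ses= field visible here, and is illustrative rather than a tool referenced by this log. Records that are themselves split across physical lines in this capture will simply show up as unmatched.

#!/usr/bin/env python3
"""Pair USER_START / USER_END audit records by their ses= number.

Assumption: stdin is plain journal text like the lines in this log; only
the audit keyword and the ses= field visible above are relied on.
"""
import re
import sys
from collections import defaultdict

# e.g.  audit[5865]: USER_START pid=5865 uid=0 auid=500 ses=23 ...
RECORD = re.compile(r"audit\[\d+\]:\s+(USER_START|USER_END)\b.*?\bses=(\d+)")


def main() -> int:
    seen = defaultdict(set)  # session number -> subset of {"USER_START", "USER_END"}
    for line in sys.stdin:
        # finditer, because several records can share one physical line here
        for m in RECORD.finditer(line):
            seen[int(m.group(2))].add(m.group(1))
    for ses in sorted(seen):
        state = "opened and closed" if len(seen[ses]) == 2 else "open or truncated"
        print(f"session {ses}: {state}")
    return 0


if __name__ == "__main__":
    sys.exit(main())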
Feb 13 08:24:47.559000 audit[6241]: CRED_ACQ pid=6241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.738300 kernel: audit: type=1101 audit(1707812687.555:447): pid=6241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.738341 kernel: audit: type=1103 audit(1707812687.559:448): pid=6241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.738356 kernel: audit: type=1006 audit(1707812687.559:449): pid=6241 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Feb 13 08:24:47.796770 kernel: audit: type=1300 audit(1707812687.559:449): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff469a0bc0 a2=3 a3=0 items=0 ppid=1 pid=6241 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:47.559000 audit[6241]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff469a0bc0 a2=3 a3=0 items=0 ppid=1 pid=6241 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:47.888647 kernel: audit: type=1327 audit(1707812687.559:449): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:47.559000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:47.919077 kernel: audit: type=1105 audit(1707812687.566:450): pid=6241 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.566000 audit[6241]: USER_START pid=6241 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.929784 sshd[6210]: Invalid user wqmarlduiqkmgs from 221.192.237.58 port 36022 Feb 13 08:24:47.930717 sshd[6210]: userauth_pubkey: parse publickey packet: incomplete message [preauth] Feb 13 08:24:47.931061 systemd[1]: sshd@26-145.40.67.79:22-221.192.237.58:36022.service: Deactivated successfully. 
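Alongside those operator sessions, sshd also logs pre-auth noise from 221.192.237.58: a banner with invalid characters on port 28867 earlier, and the malformed public-key packet for the implausible user name just above. A small per-source tally over the same kind of journal text; the two patterns are lifted from those messages and nothing beyond them is assumed.

#!/usr/bin/env python3
"""Count sshd pre-auth failures per source address in journal text.

Assumption: input lines look like the sshd messages in this log, e.g.
"Invalid user wqmarlduiqkmgs from 221.192.237.58 port 36022" or
"banner exchange: Connection from 221.192.237.58 port 28867: invalid format".
"""
import re
import sys
from collections import Counter

PATTERNS = (
    re.compile(r"Invalid user \S+ from (\d{1,3}(?:\.\d{1,3}){3})"),
    re.compile(r"Connection from (\d{1,3}(?:\.\d{1,3}){3}) port \d+: invalid format"),
)


def main() -> int:
    hits = Counter()
    for line in sys.stdin:
        for pat in PATTERNS:
            for m in pat.finditer(line):
                hits[m.group(1)] += 1
    for addr, count in hits.most_common():
        print(f"{addr}\t{count}")
    return 0


if __name__ == "__main__":
    sys.exit(main())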
Feb 13 08:24:47.566000 audit[6244]: CRED_ACQ pid=6244 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:48.102446 kernel: audit: type=1103 audit(1707812687.566:451): pid=6244 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:48.102482 kernel: audit: type=1106 audit(1707812687.644:452): pid=6241 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.644000 audit[6241]: USER_END pid=6241 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:48.197898 kernel: audit: type=1104 audit(1707812687.644:453): pid=6241 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.644000 audit[6241]: CRED_DISP pid=6241 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:47.645000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.67.79:22-139.178.68.195:49290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:47.929000 audit[6210]: USER_ERR pid=6210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/sbin/sshd" hostname=221.192.237.58 addr=221.192.237.58 terminal=ssh res=failed' Feb 13 08:24:47.930000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.67.79:22-221.192.237.58:36022 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:49.656929 env[1563]: time="2024-02-13T08:24:49.656844452Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:24:49.679092 env[1563]: time="2024-02-13T08:24:49.679029203Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:49.679237 kubelet[2738]: E0213 08:24:49.679195 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:24:49.679237 kubelet[2738]: E0213 08:24:49.679220 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:24:49.679444 kubelet[2738]: E0213 08:24:49.679243 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:49.679444 kubelet[2738]: E0213 08:24:49.679270 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:24:52.651669 systemd[1]: Started sshd@28-145.40.67.79:22-139.178.68.195:49302.service. Feb 13 08:24:52.650000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.67.79:22-139.178.68.195:49302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:52.656005 env[1563]: time="2024-02-13T08:24:52.655976660Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:24:52.669001 env[1563]: time="2024-02-13T08:24:52.668961309Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:52.669240 kubelet[2738]: E0213 08:24:52.669176 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:24:52.669240 kubelet[2738]: E0213 08:24:52.669221 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:24:52.669430 kubelet[2738]: E0213 08:24:52.669244 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:52.669430 kubelet[2738]: E0213 08:24:52.669262 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:24:52.678756 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:24:52.678803 kernel: audit: type=1130 audit(1707812692.650:457): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.67.79:22-139.178.68.195:49302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:52.786000 audit[6295]: USER_ACCT pid=6295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.787597 sshd[6295]: Accepted publickey for core from 139.178.68.195 port 49302 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:52.789315 sshd[6295]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:52.791710 systemd-logind[1548]: New session 29 of user core. Feb 13 08:24:52.792192 systemd[1]: Started session-29.scope. Feb 13 08:24:52.871180 sshd[6295]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:52.872656 systemd[1]: sshd@28-145.40.67.79:22-139.178.68.195:49302.service: Deactivated successfully. Feb 13 08:24:52.873256 systemd-logind[1548]: Session 29 logged out. Waiting for processes to exit. Feb 13 08:24:52.873261 systemd[1]: session-29.scope: Deactivated successfully. Feb 13 08:24:52.873708 systemd-logind[1548]: Removed session 29. Feb 13 08:24:52.788000 audit[6295]: CRED_ACQ pid=6295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.969148 kernel: audit: type=1101 audit(1707812692.786:458): pid=6295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.969183 kernel: audit: type=1103 audit(1707812692.788:459): pid=6295 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.969198 kernel: audit: type=1006 audit(1707812692.788:460): pid=6295 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Feb 13 08:24:53.027607 kernel: audit: type=1300 audit(1707812692.788:460): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0ecf2360 a2=3 a3=0 items=0 ppid=1 pid=6295 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:52.788000 audit[6295]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0ecf2360 a2=3 a3=0 items=0 ppid=1 pid=6295 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:53.119514 kernel: audit: type=1327 audit(1707812692.788:460): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:52.788000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:53.149891 kernel: audit: type=1105 audit(1707812692.793:461): pid=6295 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.793000 audit[6295]: 
USER_START pid=6295 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:53.244217 kernel: audit: type=1103 audit(1707812692.793:462): pid=6328 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.793000 audit[6328]: CRED_ACQ pid=6328 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:53.333264 kernel: audit: type=1106 audit(1707812692.870:463): pid=6295 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.870000 audit[6295]: USER_END pid=6295 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:53.428730 kernel: audit: type=1104 audit(1707812692.871:464): pid=6295 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.871000 audit[6295]: CRED_DISP pid=6295 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:52.871000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.67.79:22-139.178.68.195:49302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:24:53.657243 env[1563]: time="2024-02-13T08:24:53.657058026Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:24:53.678500 env[1563]: time="2024-02-13T08:24:53.678463954Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:53.678668 kubelet[2738]: E0213 08:24:53.678635 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:24:53.678668 kubelet[2738]: E0213 08:24:53.678661 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:24:53.678868 kubelet[2738]: E0213 08:24:53.678685 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:53.678868 kubelet[2738]: E0213 08:24:53.678706 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:24:57.655978 env[1563]: time="2024-02-13T08:24:57.655940046Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:24:57.678689 env[1563]: time="2024-02-13T08:24:57.678596109Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:24:57.678884 kubelet[2738]: E0213 08:24:57.678858 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:24:57.679315 kubelet[2738]: E0213 08:24:57.678918 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:24:57.679315 kubelet[2738]: E0213 08:24:57.679004 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:24:57.679315 kubelet[2738]: E0213 08:24:57.679075 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:24:57.878836 systemd[1]: Started sshd@29-145.40.67.79:22-139.178.68.195:44260.service. Feb 13 08:24:57.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.67.79:22-139.178.68.195:44260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:57.905685 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:24:57.905728 kernel: audit: type=1130 audit(1707812697.877:466): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.67.79:22-139.178.68.195:44260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:24:58.015000 audit[6412]: USER_ACCT pid=6412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.016824 sshd[6412]: Accepted publickey for core from 139.178.68.195 port 44260 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:24:58.021252 sshd[6412]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:24:58.031153 systemd-logind[1548]: New session 30 of user core. Feb 13 08:24:58.033198 systemd[1]: Started session-30.scope. 
Feb 13 08:24:58.019000 audit[6412]: CRED_ACQ pid=6412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.116878 sshd[6412]: pam_unix(sshd:session): session closed for user core Feb 13 08:24:58.118411 systemd[1]: sshd@29-145.40.67.79:22-139.178.68.195:44260.service: Deactivated successfully. Feb 13 08:24:58.118934 systemd-logind[1548]: Session 30 logged out. Waiting for processes to exit. Feb 13 08:24:58.118970 systemd[1]: session-30.scope: Deactivated successfully. Feb 13 08:24:58.119523 systemd-logind[1548]: Removed session 30. Feb 13 08:24:58.200155 kernel: audit: type=1101 audit(1707812698.015:467): pid=6412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.200192 kernel: audit: type=1103 audit(1707812698.019:468): pid=6412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.200207 kernel: audit: type=1006 audit(1707812698.019:469): pid=6412 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Feb 13 08:24:58.258603 kernel: audit: type=1300 audit(1707812698.019:469): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe073c18e0 a2=3 a3=0 items=0 ppid=1 pid=6412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:58.019000 audit[6412]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe073c18e0 a2=3 a3=0 items=0 ppid=1 pid=6412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:24:58.350522 kernel: audit: type=1327 audit(1707812698.019:469): proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:58.019000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:24:58.380922 kernel: audit: type=1105 audit(1707812698.038:470): pid=6412 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.038000 audit[6412]: USER_START pid=6412 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.475265 kernel: audit: type=1103 audit(1707812698.039:471): pid=6415 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.039000 audit[6415]: CRED_ACQ pid=6415 uid=0 auid=500 ses=30 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.564484 kernel: audit: type=1106 audit(1707812698.116:472): pid=6412 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.116000 audit[6412]: USER_END pid=6412 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.659856 kernel: audit: type=1104 audit(1707812698.116:473): pid=6412 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.116000 audit[6412]: CRED_DISP pid=6412 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:24:58.117000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.67.79:22-139.178.68.195:44260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:03.120476 systemd[1]: Started sshd@30-145.40.67.79:22-139.178.68.195:44262.service. Feb 13 08:25:03.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.67.79:22-139.178.68.195:44262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:03.147533 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:03.147608 kernel: audit: type=1130 audit(1707812703.119:475): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.67.79:22-139.178.68.195:44262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:03.256000 audit[6439]: USER_ACCT pid=6439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.258292 sshd[6439]: Accepted publickey for core from 139.178.68.195 port 44262 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:03.261379 sshd[6439]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:03.271055 systemd-logind[1548]: New session 31 of user core. Feb 13 08:25:03.272644 systemd[1]: Started session-31.scope. 
Feb 13 08:25:03.259000 audit[6439]: CRED_ACQ pid=6439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.353487 sshd[6439]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:03.354706 systemd[1]: sshd@30-145.40.67.79:22-139.178.68.195:44262.service: Deactivated successfully. Feb 13 08:25:03.355357 systemd[1]: session-31.scope: Deactivated successfully. Feb 13 08:25:03.355399 systemd-logind[1548]: Session 31 logged out. Waiting for processes to exit. Feb 13 08:25:03.355869 systemd-logind[1548]: Removed session 31. Feb 13 08:25:03.439466 kernel: audit: type=1101 audit(1707812703.256:476): pid=6439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.439577 kernel: audit: type=1103 audit(1707812703.259:477): pid=6439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.439610 kernel: audit: type=1006 audit(1707812703.259:478): pid=6439 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Feb 13 08:25:03.259000 audit[6439]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc04996540 a2=3 a3=0 items=0 ppid=1 pid=6439 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:03.589847 kernel: audit: type=1300 audit(1707812703.259:478): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc04996540 a2=3 a3=0 items=0 ppid=1 pid=6439 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:03.259000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:03.589999 kernel: audit: type=1327 audit(1707812703.259:478): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:03.273000 audit[6439]: USER_START pid=6439 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.655913 env[1563]: time="2024-02-13T08:25:03.655889666Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:25:03.674851 env[1563]: time="2024-02-13T08:25:03.674801038Z" level=error msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" failed" error="failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:25:03.675050 kubelet[2738]: E0213 08:25:03.675000 2738 remote_runtime.go:205] "StopPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:03.675050 kubelet[2738]: E0213 08:25:03.675024 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9} Feb 13 08:25:03.675050 kubelet[2738]: E0213 08:25:03.675046 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:25:03.675277 kubelet[2738]: E0213 08:25:03.675063 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d5a107e-32fc-46ef-9ba6-381664363494\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-vhpxp" podUID=7d5a107e-32fc-46ef-9ba6-381664363494 Feb 13 08:25:03.714727 kernel: audit: type=1105 audit(1707812703.273:479): pid=6439 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.714761 kernel: audit: type=1103 audit(1707812703.274:480): pid=6442 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.274000 audit[6442]: CRED_ACQ pid=6442 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.353000 audit[6439]: USER_END pid=6439 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.899330 kernel: audit: type=1106 audit(1707812703.353:481): pid=6439 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.899375 kernel: audit: type=1104 
audit(1707812703.353:482): pid=6439 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.353000 audit[6439]: CRED_DISP pid=6439 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:03.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.67.79:22-139.178.68.195:44262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:04.656789 env[1563]: time="2024-02-13T08:25:04.656697769Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:25:04.657823 env[1563]: time="2024-02-13T08:25:04.656858753Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:25:04.709312 env[1563]: time="2024-02-13T08:25:04.709260278Z" level=error msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" failed" error="failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:25:04.709441 env[1563]: time="2024-02-13T08:25:04.709343147Z" level=error msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" failed" error="failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:25:04.709488 kubelet[2738]: E0213 08:25:04.709461 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:04.709488 kubelet[2738]: E0213 08:25:04.709467 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:04.709756 kubelet[2738]: E0213 08:25:04.709495 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb} Feb 13 08:25:04.709756 kubelet[2738]: E0213 08:25:04.709502 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f} Feb 13 08:25:04.709756 kubelet[2738]: E0213 08:25:04.709526 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:25:04.709756 kubelet[2738]: E0213 08:25:04.709552 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podUID=d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d Feb 13 08:25:04.709756 kubelet[2738]: E0213 08:25:04.709551 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:25:04.709974 kubelet[2738]: E0213 08:25:04.709584 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8100728f-8434-43ba-8770-5d3f00e1f18f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mrqvb" podUID=8100728f-8434-43ba-8770-5d3f00e1f18f Feb 13 08:25:08.360625 systemd[1]: Started sshd@31-145.40.67.79:22-139.178.68.195:48360.service. Feb 13 08:25:08.359000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.67.79:22-139.178.68.195:48360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:08.387486 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:08.387523 kernel: audit: type=1130 audit(1707812708.359:484): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.67.79:22-139.178.68.195:48360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:25:08.497000 audit[6550]: USER_ACCT pid=6550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.498806 sshd[6550]: Accepted publickey for core from 139.178.68.195 port 48360 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:08.502140 sshd[6550]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:08.507702 systemd-logind[1548]: New session 32 of user core. Feb 13 08:25:08.508126 systemd[1]: Started session-32.scope. Feb 13 08:25:08.588969 sshd[6550]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:08.500000 audit[6550]: CRED_ACQ pid=6550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.590466 systemd[1]: sshd@31-145.40.67.79:22-139.178.68.195:48360.service: Deactivated successfully. Feb 13 08:25:08.591089 systemd-logind[1548]: Session 32 logged out. Waiting for processes to exit. Feb 13 08:25:08.591112 systemd[1]: session-32.scope: Deactivated successfully. Feb 13 08:25:08.591601 systemd-logind[1548]: Removed session 32. Feb 13 08:25:08.680093 kernel: audit: type=1101 audit(1707812708.497:485): pid=6550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.680144 kernel: audit: type=1103 audit(1707812708.500:486): pid=6550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.680160 kernel: audit: type=1006 audit(1707812708.500:487): pid=6550 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Feb 13 08:25:08.738556 kernel: audit: type=1300 audit(1707812708.500:487): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff57aa06e0 a2=3 a3=0 items=0 ppid=1 pid=6550 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:08.500000 audit[6550]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff57aa06e0 a2=3 a3=0 items=0 ppid=1 pid=6550 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:08.830474 kernel: audit: type=1327 audit(1707812708.500:487): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:08.500000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:08.860885 kernel: audit: type=1105 audit(1707812708.509:488): pid=6550 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.509000 audit[6550]: 
USER_START pid=6550 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.955278 kernel: audit: type=1103 audit(1707812708.510:489): pid=6553 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.510000 audit[6553]: CRED_ACQ pid=6553 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:09.044457 kernel: audit: type=1106 audit(1707812708.588:490): pid=6550 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.588000 audit[6550]: USER_END pid=6550 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:09.139911 kernel: audit: type=1104 audit(1707812708.588:491): pid=6550 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.588000 audit[6550]: CRED_DISP pid=6550 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:08.589000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.67.79:22-139.178.68.195:48360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:25:10.656627 env[1563]: time="2024-02-13T08:25:10.656562979Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:25:10.673485 env[1563]: time="2024-02-13T08:25:10.673425792Z" level=error msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" failed" error="failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:25:10.673575 kubelet[2738]: E0213 08:25:10.673559 2738 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:10.673751 kubelet[2738]: E0213 08:25:10.673584 2738 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41} Feb 13 08:25:10.673751 kubelet[2738]: E0213 08:25:10.673607 2738 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:25:10.673751 kubelet[2738]: E0213 08:25:10.673625 2738 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-tr9dh" podUID=26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 Feb 13 08:25:12.890808 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount385876908.mount: Deactivated successfully. 
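[Editor's note] The StopPodSandbox failures repeated throughout this stretch of the log all report the same CNI condition: the Calico plugin cannot stat /var/lib/calico/nodename, and the error text itself says to check that the calico/node container is running and has mounted /var/lib/calico/. Below is a minimal, illustrative Python sketch of those two checks, not something taken from this log: the path /var/lib/calico/nodename, the calico-system namespace, and the calico-node pod come from the surrounding entries, while kubectl being available on the node and the k8s-app=calico-node label selector are assumptions.

#!/usr/bin/env python3
# Illustrative sketch only (not from the log): verify the condition named in
# the repeated CNI delete failure, i.e. that /var/lib/calico/nodename exists
# on this node and that the calico-node pod is actually running.
import pathlib
import subprocess

nodename_file = pathlib.Path("/var/lib/calico/nodename")
if nodename_file.exists():
    print(f"nodename file present: {nodename_file.read_text().strip()}")
else:
    # This is exactly the stat failure the plugin reports:
    # "stat /var/lib/calico/nodename: no such file or directory"
    print("nodename file missing - calico/node has not (re)written it yet")

# List the calico-node pods; a Pending or crash-looping pod here would explain
# both the missing file and why every StopPodSandbox call keeps failing.
# Assumes kubectl on PATH and the conventional k8s-app=calico-node label.
subprocess.run(
    ["kubectl", "-n", "calico-system", "get", "pods",
     "-l", "k8s-app=calico-node", "-o", "wide"],
    check=False,
)

Consistent with that reading, the entries that follow (the ghcr.io/flatcar/calico/node:v3.27.0 image pull and the successful StartContainer for calico-node at 08:25:12, and the observed startup of calico-system/calico-node-cph82 at 08:25:13) mark the point after which these teardown errors would be expected to clear. [End editor's note]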
Feb 13 08:25:12.912712 env[1563]: time="2024-02-13T08:25:12.912689902Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:12.913195 env[1563]: time="2024-02-13T08:25:12.913158170Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:1843802b91be8ff1c1d35ee08461ebe909e7a2199e59396f69886439a372312c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:12.913753 env[1563]: time="2024-02-13T08:25:12.913739428Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:12.914674 env[1563]: time="2024-02-13T08:25:12.914661208Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node@sha256:a45dffb21a0e9ca8962f36359a2ab776beeecd93843543c2fa1745d7bbb0f754,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:12.914837 env[1563]: time="2024-02-13T08:25:12.914824716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.27.0\" returns image reference \"sha256:1843802b91be8ff1c1d35ee08461ebe909e7a2199e59396f69886439a372312c\"" Feb 13 08:25:12.918928 env[1563]: time="2024-02-13T08:25:12.918909213Z" level=info msg="CreateContainer within sandbox \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 08:25:12.924422 env[1563]: time="2024-02-13T08:25:12.924351028Z" level=info msg="CreateContainer within sandbox \"036d61154d43a349c70212d1e6e20a8f4614cff752061053175834c1c614a9a5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"d8fd0d16ae5cdd33114b3d76b4d831f136337c8c68a12fd53e7f2249b80dd225\"" Feb 13 08:25:12.924775 env[1563]: time="2024-02-13T08:25:12.924718700Z" level=info msg="StartContainer for \"d8fd0d16ae5cdd33114b3d76b4d831f136337c8c68a12fd53e7f2249b80dd225\"" Feb 13 08:25:12.948660 env[1563]: time="2024-02-13T08:25:12.948631375Z" level=info msg="StartContainer for \"d8fd0d16ae5cdd33114b3d76b4d831f136337c8c68a12fd53e7f2249b80dd225\" returns successfully" Feb 13 08:25:13.056572 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 08:25:13.056628 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Feb 13 08:25:13.596455 systemd[1]: Started sshd@32-145.40.67.79:22-139.178.68.195:48362.service. Feb 13 08:25:13.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.67.79:22-139.178.68.195:48362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:13.623046 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:13.623142 kernel: audit: type=1130 audit(1707812713.595:493): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.67.79:22-139.178.68.195:48362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:25:13.732000 audit[6690]: USER_ACCT pid=6690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.733299 sshd[6690]: Accepted publickey for core from 139.178.68.195 port 48362 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:13.735927 sshd[6690]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:13.738438 systemd-logind[1548]: New session 33 of user core. Feb 13 08:25:13.738923 systemd[1]: Started session-33.scope. Feb 13 08:25:13.817875 sshd[6690]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:13.819361 systemd[1]: sshd@32-145.40.67.79:22-139.178.68.195:48362.service: Deactivated successfully. Feb 13 08:25:13.819940 systemd-logind[1548]: Session 33 logged out. Waiting for processes to exit. Feb 13 08:25:13.820016 systemd[1]: session-33.scope: Deactivated successfully. Feb 13 08:25:13.820540 systemd-logind[1548]: Removed session 33. Feb 13 08:25:13.734000 audit[6690]: CRED_ACQ pid=6690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.827464 kubelet[2738]: I0213 08:25:13.827415 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-cph82" podStartSLOduration=-9.223371602027384e+09 pod.CreationTimestamp="2024-02-13 08:17:59 +0000 UTC" firstStartedPulling="2024-02-13 08:17:59.979917425 +0000 UTC m=+19.506826523" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:25:13.826837835 +0000 UTC m=+453.353746902" watchObservedRunningTime="2024-02-13 08:25:13.827392386 +0000 UTC m=+453.354301449" Feb 13 08:25:13.914364 kernel: audit: type=1101 audit(1707812713.732:494): pid=6690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.914423 kernel: audit: type=1103 audit(1707812713.734:495): pid=6690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.914443 kernel: audit: type=1006 audit(1707812713.734:496): pid=6690 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1 Feb 13 08:25:13.972490 kernel: audit: type=1300 audit(1707812713.734:496): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff2b344790 a2=3 a3=0 items=0 ppid=1 pid=6690 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:13.734000 audit[6690]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff2b344790 a2=3 a3=0 items=0 ppid=1 pid=6690 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.063866 kernel: audit: type=1327 
audit(1707812713.734:496): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:13.734000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:14.094073 kernel: audit: type=1105 audit(1707812713.740:497): pid=6690 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.740000 audit[6690]: USER_START pid=6690 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.740000 audit[6693]: CRED_ACQ pid=6693 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:14.277330 kernel: audit: type=1103 audit(1707812713.740:498): pid=6693 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:14.277375 kernel: audit: type=1106 audit(1707812713.817:499): pid=6690 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.817000 audit[6690]: USER_END pid=6690 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:14.372249 kernel: audit: type=1104 audit(1707812713.817:500): pid=6690 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.817000 audit[6690]: CRED_DISP pid=6690 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:13.818000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.67.79:22-139.178.68.195:48362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:25:14.174000 audit[6782]: AVC avc: denied { write } for pid=6782 comm="tee" name="fd" dev="proc" ino=43873 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.174000 audit[6782]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7fffa3de596d a2=241 a3=1b6 items=1 ppid=6751 pid=6782 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.174000 audit: CWD cwd="/etc/service/enabled/confd/log" Feb 13 08:25:14.174000 audit: PATH item=0 name="/dev/fd/63" inode=45796 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.174000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.174000 audit[6785]: AVC avc: denied { write } for pid=6785 comm="tee" name="fd" dev="proc" ino=51723 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.174000 audit[6785]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc3459695d a2=241 a3=1b6 items=1 ppid=6755 pid=6785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.174000 audit: CWD cwd="/etc/service/enabled/allocate-tunnel-addrs/log" Feb 13 08:25:14.174000 audit: PATH item=0 name="/dev/fd/63" inode=51720 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.174000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.174000 audit[6791]: AVC avc: denied { write } for pid=6791 comm="tee" name="fd" dev="proc" ino=40912 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.174000 audit[6791]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc2f8c296f a2=241 a3=1b6 items=1 ppid=6754 pid=6791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.174000 audit: CWD cwd="/etc/service/enabled/cni/log" Feb 13 08:25:14.174000 audit: PATH item=0 name="/dev/fd/63" inode=41977 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.174000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.175000 audit[6790]: AVC avc: denied { write } for pid=6790 comm="tee" name="fd" dev="proc" ino=53704 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.175000 audit[6790]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffd24ef296d a2=241 a3=1b6 items=1 ppid=6752 pid=6790 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.175000 audit: CWD cwd="/etc/service/enabled/felix/log" Feb 13 08:25:14.175000 audit: PATH item=0 name="/dev/fd/63" inode=40909 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.175000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.175000 audit[6797]: AVC avc: denied { write } for pid=6797 comm="tee" name="fd" dev="proc" ino=40916 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.175000 audit[6797]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffe2a44196e a2=241 a3=1b6 items=1 ppid=6762 pid=6797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.175000 audit: CWD cwd="/etc/service/enabled/bird/log" Feb 13 08:25:14.175000 audit: PATH item=0 name="/dev/fd/63" inode=41978 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.175000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.175000 audit[6795]: AVC avc: denied { write } for pid=6795 comm="tee" name="fd" dev="proc" ino=43877 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.175000 audit[6795]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffcd771896d a2=241 a3=1b6 items=1 ppid=6756 pid=6795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.175000 audit: CWD cwd="/etc/service/enabled/bird6/log" Feb 13 08:25:14.175000 audit: PATH item=0 name="/dev/fd/63" inode=49848 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.175000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.175000 audit[6810]: AVC avc: denied { write } for pid=6810 comm="tee" name="fd" dev="proc" ino=51727 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Feb 13 08:25:14.175000 audit[6810]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7fff2cf6395e a2=241 a3=1b6 items=1 ppid=6757 pid=6810 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.175000 audit: CWD cwd="/etc/service/enabled/node-status-reporter/log" Feb 13 08:25:14.175000 audit: PATH item=0 name="/dev/fd/63" inode=45797 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 
cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:25:14.175000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit: BPF prog-id=10 op=LOAD Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff921be510 a2=70 a3=7ff094259000 items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit: BPF prog-id=10 op=UNLOAD Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC 
avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit: BPF prog-id=11 op=LOAD Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff921be510 a2=70 a3=6e items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit: BPF prog-id=11 op=UNLOAD Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7fff921be4c0 a2=70 a3=7fff921be510 items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit: BPF prog-id=12 op=LOAD Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff921be4a0 a2=70 a3=7fff921be510 items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit: BPF prog-id=12 op=UNLOAD Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff921be580 a2=70 a3=0 items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff921be570 a2=70 a3=0 items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" 
capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=0 a1=7fff921be5b0 a2=70 a3=0 items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { perfmon } for pid=6925 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit[6925]: AVC avc: denied { bpf } for pid=6925 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.461000 audit: BPF prog-id=13 op=LOAD Feb 13 08:25:14.461000 audit[6925]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff921be4d0 a2=70 a3=ffffffff items=0 ppid=6759 pid=6925 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.461000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Feb 13 08:25:14.463000 audit[6929]: AVC avc: denied 
{ bpf } for pid=6929 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.463000 audit[6929]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffef30d4610 a2=70 a3=fff80800 items=0 ppid=6759 pid=6929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.463000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Feb 13 08:25:14.463000 audit[6929]: AVC avc: denied { bpf } for pid=6929 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:25:14.463000 audit[6929]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffef30d44e0 a2=70 a3=3 items=0 ppid=6759 pid=6929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.463000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Feb 13 08:25:14.480000 audit: BPF prog-id=13 op=UNLOAD Feb 13 08:25:14.501000 audit[6984]: NETFILTER_CFG table=mangle:111 family=2 entries=19 op=nft_register_chain pid=6984 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:14.501000 audit[6984]: SYSCALL arch=c000003e syscall=46 success=yes exit=6800 a0=3 a1=7ffee0320280 a2=0 a3=7ffee032026c items=0 ppid=6759 pid=6984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.501000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:14.502000 audit[6983]: NETFILTER_CFG table=raw:112 family=2 entries=19 op=nft_register_chain pid=6983 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:14.502000 audit[6983]: SYSCALL arch=c000003e syscall=46 success=yes exit=6132 a0=3 a1=7fff68701190 a2=0 a3=55f7f7a8b000 items=0 ppid=6759 pid=6983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.502000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:14.503000 audit[6985]: NETFILTER_CFG table=nat:113 family=2 entries=16 op=nft_register_chain pid=6985 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:14.503000 audit[6985]: SYSCALL arch=c000003e syscall=46 success=yes exit=5188 a0=3 a1=7ffc096f6ab0 a2=0 a3=55af4c594000 items=0 ppid=6759 pid=6985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 
08:25:14.503000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:14.505000 audit[6988]: NETFILTER_CFG table=filter:114 family=2 entries=39 op=nft_register_chain pid=6988 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:14.505000 audit[6988]: SYSCALL arch=c000003e syscall=46 success=yes exit=18472 a0=3 a1=7fff696e2630 a2=0 a3=559f47282000 items=0 ppid=6759 pid=6988 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.505000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:14.657519 env[1563]: time="2024-02-13T08:25:14.657247485Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.748 [INFO][7007] k8s.go 578: Cleaning up netns ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.749 [INFO][7007] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" iface="eth0" netns="/var/run/netns/cni-27488907-48e9-1534-318b-507354b38ace" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.749 [INFO][7007] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" iface="eth0" netns="/var/run/netns/cni-27488907-48e9-1534-318b-507354b38ace" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.749 [INFO][7007] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" iface="eth0" netns="/var/run/netns/cni-27488907-48e9-1534-318b-507354b38ace" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.749 [INFO][7007] k8s.go 585: Releasing IP address(es) ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.749 [INFO][7007] utils.go 188: Calico CNI releasing IP address ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.793 [INFO][7025] ipam_plugin.go 415: Releasing address using handleID ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.793 [INFO][7025] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.793 [INFO][7025] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.805 [WARNING][7025] ipam_plugin.go 432: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.805 [INFO][7025] ipam_plugin.go 443: Releasing address using workloadID ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.807 [INFO][7025] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:14.811589 env[1563]: 2024-02-13 08:25:14.809 [INFO][7007] k8s.go 591: Teardown processing complete. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:14.812817 env[1563]: time="2024-02-13T08:25:14.811789712Z" level=info msg="TearDown network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" successfully" Feb 13 08:25:14.812817 env[1563]: time="2024-02-13T08:25:14.811838462Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" returns successfully" Feb 13 08:25:14.813092 env[1563]: time="2024-02-13T08:25:14.812830264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-vhpxp,Uid:7d5a107e-32fc-46ef-9ba6-381664363494,Namespace:kube-system,Attempt:1,}" Feb 13 08:25:14.816539 systemd[1]: run-netns-cni\x2d27488907\x2d48e9\x2d1534\x2d318b\x2d507354b38ace.mount: Deactivated successfully. Feb 13 08:25:14.911380 systemd-networkd[1419]: cali959243e1537: Link UP Feb 13 08:25:14.939901 systemd-networkd[1419]: cali959243e1537: Gained carrier Feb 13 08:25:14.940006 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali959243e1537: link becomes ready Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.859 [INFO][7044] plugin.go 327: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0 coredns-787d4945fb- kube-system 7d5a107e-32fc-46ef-9ba6-381664363494 1631 0 2024-02-13 08:17:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:787d4945fb projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3510.3.2-a-56b02fc11a coredns-787d4945fb-vhpxp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali959243e1537 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.859 [INFO][7044] k8s.go 76: Extracted identifiers for CmdAddK8s ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.880 [INFO][7095] ipam_plugin.go 228: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" HandleID="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" 
Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.888 [INFO][7095] ipam_plugin.go 268: Auto assigning IP ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" HandleID="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004f9200), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3510.3.2-a-56b02fc11a", "pod":"coredns-787d4945fb-vhpxp", "timestamp":"2024-02-13 08:25:14.880315903 +0000 UTC"}, Hostname:"ci-3510.3.2-a-56b02fc11a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.888 [INFO][7095] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.888 [INFO][7095] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.889 [INFO][7095] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.2-a-56b02fc11a' Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.890 [INFO][7095] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.894 [INFO][7095] ipam.go 372: Looking up existing affinities for host host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.897 [INFO][7095] ipam.go 489: Trying affinity for 192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.899 [INFO][7095] ipam.go 155: Attempting to load block cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.901 [INFO][7095] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.901 [INFO][7095] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.100.128/26 handle="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.902 [INFO][7095] ipam.go 1682: Creating new handle: k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35 Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.904 [INFO][7095] ipam.go 1203: Writing block in order to claim IPs block=192.168.100.128/26 handle="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.908 [INFO][7095] ipam.go 1216: Successfully claimed IPs: [192.168.100.129/26] block=192.168.100.128/26 handle="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.908 [INFO][7095] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.129/26] handle="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:14.945823 env[1563]: 2024-02-13 
08:25:14.908 [INFO][7095] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:14.945823 env[1563]: 2024-02-13 08:25:14.908 [INFO][7095] ipam_plugin.go 286: Calico CNI IPAM assigned addresses IPv4=[192.168.100.129/26] IPv6=[] ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" HandleID="k8s-pod-network.6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.946367 env[1563]: 2024-02-13 08:25:14.910 [INFO][7044] k8s.go 385: Populated endpoint ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"7d5a107e-32fc-46ef-9ba6-381664363494", ResourceVersion:"1631", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"", Pod:"coredns-787d4945fb-vhpxp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali959243e1537", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:14.946367 env[1563]: 2024-02-13 08:25:14.910 [INFO][7044] k8s.go 386: Calico CNI using IPs: [192.168.100.129/32] ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.946367 env[1563]: 2024-02-13 08:25:14.910 [INFO][7044] dataplane_linux.go 68: Setting the host side veth name to cali959243e1537 ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.946367 env[1563]: 2024-02-13 08:25:14.939 [INFO][7044] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" 
WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.946367 env[1563]: 2024-02-13 08:25:14.940 [INFO][7044] k8s.go 413: Added Mac, interface name, and active container ID to endpoint ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"7d5a107e-32fc-46ef-9ba6-381664363494", ResourceVersion:"1631", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35", Pod:"coredns-787d4945fb-vhpxp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali959243e1537", MAC:"a6:73:8d:e5:f8:49", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:14.946367 env[1563]: 2024-02-13 08:25:14.944 [INFO][7044] k8s.go 491: Wrote updated endpoint to datastore ContainerID="6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35" Namespace="kube-system" Pod="coredns-787d4945fb-vhpxp" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:14.952093 env[1563]: time="2024-02-13T08:25:14.951987892Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:25:14.952093 env[1563]: time="2024-02-13T08:25:14.952018267Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:25:14.952093 env[1563]: time="2024-02-13T08:25:14.952026561Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:25:14.952220 env[1563]: time="2024-02-13T08:25:14.952187261Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35 pid=7130 runtime=io.containerd.runc.v2 Feb 13 08:25:14.975000 audit[7161]: NETFILTER_CFG table=filter:115 family=2 entries=36 op=nft_register_chain pid=7161 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:14.975000 audit[7161]: SYSCALL arch=c000003e syscall=46 success=yes exit=19908 a0=3 a1=7ffd63fbfbd0 a2=0 a3=7ffd63fbfbbc items=0 ppid=6759 pid=7161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:14.975000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:14.986111 env[1563]: time="2024-02-13T08:25:14.986084324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-vhpxp,Uid:7d5a107e-32fc-46ef-9ba6-381664363494,Namespace:kube-system,Attempt:1,} returns sandbox id \"6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35\"" Feb 13 08:25:14.987581 env[1563]: time="2024-02-13T08:25:14.987562996Z" level=info msg="CreateContainer within sandbox \"6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 08:25:14.992139 env[1563]: time="2024-02-13T08:25:14.992091728Z" level=info msg="CreateContainer within sandbox \"6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"161cedfaa4208ac9ca00d5444e301e19954057d72eb00b5cac1347af036e118e\"" Feb 13 08:25:14.992414 env[1563]: time="2024-02-13T08:25:14.992348203Z" level=info msg="StartContainer for \"161cedfaa4208ac9ca00d5444e301e19954057d72eb00b5cac1347af036e118e\"" Feb 13 08:25:15.015228 env[1563]: time="2024-02-13T08:25:15.015172734Z" level=info msg="StartContainer for \"161cedfaa4208ac9ca00d5444e301e19954057d72eb00b5cac1347af036e118e\" returns successfully" Feb 13 08:25:15.258428 systemd-networkd[1419]: vxlan.calico: Link UP Feb 13 08:25:15.258440 systemd-networkd[1419]: vxlan.calico: Gained carrier Feb 13 08:25:15.843126 kubelet[2738]: I0213 08:25:15.843066 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-787d4945fb-vhpxp" podStartSLOduration=441.843029791 pod.CreationTimestamp="2024-02-13 08:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:25:15.842985473 +0000 UTC m=+455.369894548" watchObservedRunningTime="2024-02-13 08:25:15.843029791 +0000 UTC m=+455.369938858" Feb 13 08:25:15.862000 audit[7246]: NETFILTER_CFG table=filter:116 family=2 entries=12 op=nft_register_rule pid=7246 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:15.862000 audit[7246]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7fffcfbb60b0 a2=0 a3=7fffcfbb609c items=0 ppid=3000 pid=7246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Feb 13 08:25:15.862000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:15.863000 audit[7246]: NETFILTER_CFG table=nat:117 family=2 entries=30 op=nft_register_rule pid=7246 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:15.863000 audit[7246]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7fffcfbb60b0 a2=0 a3=7fffcfbb609c items=0 ppid=3000 pid=7246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:15.863000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:15.964000 audit[7272]: NETFILTER_CFG table=filter:118 family=2 entries=9 op=nft_register_rule pid=7272 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:15.964000 audit[7272]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffe58891880 a2=0 a3=7ffe5889186c items=0 ppid=3000 pid=7272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:15.964000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:15.968000 audit[7272]: NETFILTER_CFG table=nat:119 family=2 entries=51 op=nft_register_chain pid=7272 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:15.968000 audit[7272]: SYSCALL arch=c000003e syscall=46 success=yes exit=19324 a0=3 a1=7ffe58891880 a2=0 a3=7ffe5889186c items=0 ppid=3000 pid=7272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:15.968000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:16.050230 systemd-networkd[1419]: cali959243e1537: Gained IPv6LL Feb 13 08:25:16.306300 systemd-networkd[1419]: vxlan.calico: Gained IPv6LL Feb 13 08:25:18.656753 env[1563]: time="2024-02-13T08:25:18.656619734Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.726 [INFO][7292] k8s.go 578: Cleaning up netns ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.726 [INFO][7292] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" iface="eth0" netns="/var/run/netns/cni-629e3224-ea80-fde8-2984-0597f3da170c" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.727 [INFO][7292] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" iface="eth0" netns="/var/run/netns/cni-629e3224-ea80-fde8-2984-0597f3da170c" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.727 [INFO][7292] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" iface="eth0" netns="/var/run/netns/cni-629e3224-ea80-fde8-2984-0597f3da170c" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.727 [INFO][7292] k8s.go 585: Releasing IP address(es) ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.727 [INFO][7292] utils.go 188: Calico CNI releasing IP address ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.745 [INFO][7309] ipam_plugin.go 415: Releasing address using handleID ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.745 [INFO][7309] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.746 [INFO][7309] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.753 [WARNING][7309] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.753 [INFO][7309] ipam_plugin.go 443: Releasing address using workloadID ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.755 [INFO][7309] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:18.757337 env[1563]: 2024-02-13 08:25:18.756 [INFO][7292] k8s.go 591: Teardown processing complete. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:18.757919 env[1563]: time="2024-02-13T08:25:18.757464409Z" level=info msg="TearDown network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" successfully" Feb 13 08:25:18.757919 env[1563]: time="2024-02-13T08:25:18.757502928Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" returns successfully" Feb 13 08:25:18.758133 env[1563]: time="2024-02-13T08:25:18.758076270Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mrqvb,Uid:8100728f-8434-43ba-8770-5d3f00e1f18f,Namespace:calico-system,Attempt:1,}" Feb 13 08:25:18.760587 systemd[1]: run-netns-cni\x2d629e3224\x2dea80\x2dfde8\x2d2984\x2d0597f3da170c.mount: Deactivated successfully. Feb 13 08:25:18.822212 systemd[1]: Started sshd@33-145.40.67.79:22-139.178.68.195:46972.service. Feb 13 08:25:18.821000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.67.79:22-139.178.68.195:46972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:25:18.848547 kernel: kauditd_printk_skb: 134 callbacks suppressed Feb 13 08:25:18.848657 kernel: audit: type=1130 audit(1707812718.821:532): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.67.79:22-139.178.68.195:46972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:18.945998 systemd-networkd[1419]: cali29cf82678dc: Link UP Feb 13 08:25:19.000842 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Feb 13 08:25:19.000872 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali29cf82678dc: link becomes ready Feb 13 08:25:19.000887 systemd-networkd[1419]: cali29cf82678dc: Gained carrier Feb 13 08:25:18.999000 audit[7351]: USER_ACCT pid=7351 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.000974 sshd[7351]: Accepted publickey for core from 139.178.68.195 port 46972 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:19.001113 kernel: audit: type=1101 audit(1707812718.999:533): pid=7351 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.003295 sshd[7351]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:19.005904 systemd-logind[1548]: New session 34 of user core. Feb 13 08:25:19.006893 systemd[1]: Started session-34.scope. Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.812 [INFO][7328] plugin.go 327: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0 csi-node-driver- calico-system 8100728f-8434-43ba-8770-5d3f00e1f18f 1653 0 2024-02-13 08:17:59 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:7c77f88967 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s ci-3510.3.2-a-56b02fc11a csi-node-driver-mrqvb eth0 default [] [] [kns.calico-system ksa.calico-system.default] cali29cf82678dc [] []}} ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.812 [INFO][7328] k8s.go 76: Extracted identifiers for CmdAddK8s ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.838 [INFO][7346] ipam_plugin.go 228: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" HandleID="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.852 [INFO][7346] ipam_plugin.go 268: Auto assigning IP 
ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" HandleID="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000514a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3510.3.2-a-56b02fc11a", "pod":"csi-node-driver-mrqvb", "timestamp":"2024-02-13 08:25:18.838680732 +0000 UTC"}, Hostname:"ci-3510.3.2-a-56b02fc11a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.852 [INFO][7346] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.852 [INFO][7346] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.852 [INFO][7346] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.2-a-56b02fc11a' Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.854 [INFO][7346] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.857 [INFO][7346] ipam.go 372: Looking up existing affinities for host host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.935 [INFO][7346] ipam.go 489: Trying affinity for 192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.937 [INFO][7346] ipam.go 155: Attempting to load block cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.938 [INFO][7346] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.938 [INFO][7346] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.100.128/26 handle="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.939 [INFO][7346] ipam.go 1682: Creating new handle: k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4 Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.941 [INFO][7346] ipam.go 1203: Writing block in order to claim IPs block=192.168.100.128/26 handle="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.944 [INFO][7346] ipam.go 1216: Successfully claimed IPs: [192.168.100.130/26] block=192.168.100.128/26 handle="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.944 [INFO][7346] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.130/26] handle="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.944 [INFO][7346] ipam_plugin.go 377: Released host-wide IPAM lock. 
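[Editor's note] The Calico IPAM sequence recorded just above (look up the block affinity for 192.168.100.128/26 on ci-3510.3.2-a-56b02fc11a, claim one address, release the host-wide lock) follows a fixed ipam.go log pattern, so assignments can be mined out of a captured journal. The sketch below is not part of Calico or Flatcar; it is a minimal, assumed helper (the name extract_assigned_ips and the regular expression are inferred only from the log lines shown here) for pulling the auto-assigned CIDR and host out of text like this.

    import re
    from typing import List, Tuple

    # Matches lines such as:
    #   ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.130/26] handle="..." host="ci-3510.3.2-a-56b02fc11a"
    # Pattern inferred from the journal above; hypothetical helper, not a Calico API.
    ASSIGN_RE = re.compile(
        r"ipam\.go \d+: Auto-assigned \d+ out of \d+ IPv4s: \[([0-9./]+)\].*host=\"([^\"]+)\""
    )

    def extract_assigned_ips(journal_text: str) -> List[Tuple[str, str]]:
        """Return (cidr, host) pairs for every Calico IPAM auto-assignment found in the text."""
        return ASSIGN_RE.findall(journal_text)

    if __name__ == "__main__":
        sample = (
            'ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.130/26] '
            'handle="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" '
            'host="ci-3510.3.2-a-56b02fc11a"'
        )
        print(extract_assigned_ips(sample))  # [('192.168.100.130/26', 'ci-3510.3.2-a-56b02fc11a')]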
Feb 13 08:25:19.010695 env[1563]: 2024-02-13 08:25:18.944 [INFO][7346] ipam_plugin.go 286: Calico CNI IPAM assigned addresses IPv4=[192.168.100.130/26] IPv6=[] ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" HandleID="k8s-pod-network.e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.011312 env[1563]: 2024-02-13 08:25:18.945 [INFO][7328] k8s.go 385: Populated endpoint ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8100728f-8434-43ba-8770-5d3f00e1f18f", ResourceVersion:"1653", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7c77f88967", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"", Pod:"csi-node-driver-mrqvb", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.100.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali29cf82678dc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:19.011312 env[1563]: 2024-02-13 08:25:18.945 [INFO][7328] k8s.go 386: Calico CNI using IPs: [192.168.100.130/32] ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.011312 env[1563]: 2024-02-13 08:25:18.945 [INFO][7328] dataplane_linux.go 68: Setting the host side veth name to cali29cf82678dc ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.011312 env[1563]: 2024-02-13 08:25:19.000 [INFO][7328] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.011312 env[1563]: 2024-02-13 08:25:19.001 [INFO][7328] k8s.go 413: Added Mac, interface name, and active container ID to endpoint ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" 
WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8100728f-8434-43ba-8770-5d3f00e1f18f", ResourceVersion:"1653", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7c77f88967", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4", Pod:"csi-node-driver-mrqvb", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.100.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali29cf82678dc", MAC:"ee:81:9e:b3:04:00", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:19.011312 env[1563]: 2024-02-13 08:25:19.006 [INFO][7328] k8s.go 491: Wrote updated endpoint to datastore ContainerID="e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4" Namespace="calico-system" Pod="csi-node-driver-mrqvb" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:19.022951 env[1563]: time="2024-02-13T08:25:19.022916758Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:25:19.022951 env[1563]: time="2024-02-13T08:25:19.022936948Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:25:19.022951 env[1563]: time="2024-02-13T08:25:19.022943927Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:25:19.023120 env[1563]: time="2024-02-13T08:25:19.023013241Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4 pid=7391 runtime=io.containerd.runc.v2 Feb 13 08:25:19.086494 sshd[7351]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:19.087770 systemd[1]: sshd@33-145.40.67.79:22-139.178.68.195:46972.service: Deactivated successfully. Feb 13 08:25:19.088315 systemd-logind[1548]: Session 34 logged out. Waiting for processes to exit. Feb 13 08:25:19.088357 systemd[1]: session-34.scope: Deactivated successfully. Feb 13 08:25:19.088813 systemd-logind[1548]: Removed session 34. 
Feb 13 08:25:19.002000 audit[7351]: CRED_ACQ pid=7351 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.178146 kernel: audit: type=1103 audit(1707812719.002:534): pid=7351 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.178220 kernel: audit: type=1006 audit(1707812719.002:535): pid=7351 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1 Feb 13 08:25:19.002000 audit[7351]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffda21b8520 a2=3 a3=0 items=0 ppid=1 pid=7351 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:19.002000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:19.354316 kernel: audit: type=1300 audit(1707812719.002:535): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffda21b8520 a2=3 a3=0 items=0 ppid=1 pid=7351 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:19.354360 kernel: audit: type=1327 audit(1707812719.002:535): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:19.354384 kernel: audit: type=1105 audit(1707812719.008:536): pid=7351 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.008000 audit[7351]: USER_START pid=7351 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.359423 env[1563]: time="2024-02-13T08:25:19.359393309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mrqvb,Uid:8100728f-8434-43ba-8770-5d3f00e1f18f,Namespace:calico-system,Attempt:1,} returns sandbox id \"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4\"" Feb 13 08:25:19.360068 env[1563]: time="2024-02-13T08:25:19.360053470Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.27.0\"" Feb 13 08:25:19.009000 audit[7373]: CRED_ACQ pid=7373 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.534658 kernel: audit: type=1103 audit(1707812719.009:537): pid=7373 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.534730 kernel: audit: type=1325 audit(1707812719.013:538): table=filter:120 family=2 entries=40 op=nft_register_chain pid=7384 
subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:19.013000 audit[7384]: NETFILTER_CFG table=filter:120 family=2 entries=40 op=nft_register_chain pid=7384 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:19.013000 audit[7384]: SYSCALL arch=c000003e syscall=46 success=yes exit=21096 a0=3 a1=7ffce444d580 a2=0 a3=7ffce444d56c items=0 ppid=6759 pid=7384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:19.656320 env[1563]: time="2024-02-13T08:25:19.656296909Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:25:19.687246 kernel: audit: type=1300 audit(1707812719.013:538): arch=c000003e syscall=46 success=yes exit=21096 a0=3 a1=7ffce444d580 a2=0 a3=7ffce444d56c items=0 ppid=6759 pid=7384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:19.013000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:19.086000 audit[7351]: USER_END pid=7351 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.086000 audit[7351]: CRED_DISP pid=7351 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:19.086000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.67.79:22-139.178.68.195:46972 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.677 [INFO][7467] k8s.go 578: Cleaning up netns ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.677 [INFO][7467] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" iface="eth0" netns="/var/run/netns/cni-788c6bab-72b6-33bf-ee30-2f296d1b8922" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.678 [INFO][7467] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" iface="eth0" netns="/var/run/netns/cni-788c6bab-72b6-33bf-ee30-2f296d1b8922" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.678 [INFO][7467] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" iface="eth0" netns="/var/run/netns/cni-788c6bab-72b6-33bf-ee30-2f296d1b8922" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.678 [INFO][7467] k8s.go 585: Releasing IP address(es) ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.678 [INFO][7467] utils.go 188: Calico CNI releasing IP address ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.687 [INFO][7482] ipam_plugin.go 415: Releasing address using handleID ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.687 [INFO][7482] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.687 [INFO][7482] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.691 [WARNING][7482] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.691 [INFO][7482] ipam_plugin.go 443: Releasing address using workloadID ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.693 [INFO][7482] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:19.694389 env[1563]: 2024-02-13 08:25:19.693 [INFO][7467] k8s.go 591: Teardown processing complete. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:19.694774 env[1563]: time="2024-02-13T08:25:19.694463982Z" level=info msg="TearDown network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" successfully" Feb 13 08:25:19.694774 env[1563]: time="2024-02-13T08:25:19.694483177Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" returns successfully" Feb 13 08:25:19.694849 env[1563]: time="2024-02-13T08:25:19.694836455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-646c8f86fc-xkf58,Uid:d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d,Namespace:calico-system,Attempt:1,}" Feb 13 08:25:19.750856 systemd-networkd[1419]: cali0bd90e1344e: Link UP Feb 13 08:25:19.760729 systemd[1]: run-netns-cni\x2d788c6bab\x2d72b6\x2d33bf\x2dee30\x2d2f296d1b8922.mount: Deactivated successfully. 
Feb 13 08:25:19.778964 systemd-networkd[1419]: cali0bd90e1344e: Gained carrier Feb 13 08:25:19.779025 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali0bd90e1344e: link becomes ready Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.714 [INFO][7498] plugin.go 327: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0 calico-kube-controllers-646c8f86fc- calico-system d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d 1662 0 2024-02-13 08:17:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:646c8f86fc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-3510.3.2-a-56b02fc11a calico-kube-controllers-646c8f86fc-xkf58 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0bd90e1344e [] []}} ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.714 [INFO][7498] k8s.go 76: Extracted identifiers for CmdAddK8s ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.727 [INFO][7516] ipam_plugin.go 228: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" HandleID="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.733 [INFO][7516] ipam_plugin.go 268: Auto assigning IP ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" HandleID="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00051ed00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3510.3.2-a-56b02fc11a", "pod":"calico-kube-controllers-646c8f86fc-xkf58", "timestamp":"2024-02-13 08:25:19.727777467 +0000 UTC"}, Hostname:"ci-3510.3.2-a-56b02fc11a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.733 [INFO][7516] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.733 [INFO][7516] ipam_plugin.go 371: Acquired host-wide IPAM lock. 
Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.733 [INFO][7516] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.2-a-56b02fc11a' Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.735 [INFO][7516] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.738 [INFO][7516] ipam.go 372: Looking up existing affinities for host host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.740 [INFO][7516] ipam.go 489: Trying affinity for 192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.742 [INFO][7516] ipam.go 155: Attempting to load block cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.743 [INFO][7516] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.743 [INFO][7516] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.100.128/26 handle="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.744 [INFO][7516] ipam.go 1682: Creating new handle: k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.746 [INFO][7516] ipam.go 1203: Writing block in order to claim IPs block=192.168.100.128/26 handle="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.749 [INFO][7516] ipam.go 1216: Successfully claimed IPs: [192.168.100.131/26] block=192.168.100.128/26 handle="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.749 [INFO][7516] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.131/26] handle="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.749 [INFO][7516] ipam_plugin.go 377: Released host-wide IPAM lock. 
Feb 13 08:25:19.784221 env[1563]: 2024-02-13 08:25:19.749 [INFO][7516] ipam_plugin.go 286: Calico CNI IPAM assigned addresses IPv4=[192.168.100.131/26] IPv6=[] ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" HandleID="k8s-pod-network.d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.784670 env[1563]: 2024-02-13 08:25:19.750 [INFO][7498] k8s.go 385: Populated endpoint ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0", GenerateName:"calico-kube-controllers-646c8f86fc-", Namespace:"calico-system", SelfLink:"", UID:"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d", ResourceVersion:"1662", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"646c8f86fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"", Pod:"calico-kube-controllers-646c8f86fc-xkf58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0bd90e1344e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:19.784670 env[1563]: 2024-02-13 08:25:19.750 [INFO][7498] k8s.go 386: Calico CNI using IPs: [192.168.100.131/32] ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.784670 env[1563]: 2024-02-13 08:25:19.750 [INFO][7498] dataplane_linux.go 68: Setting the host side veth name to cali0bd90e1344e ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.784670 env[1563]: 2024-02-13 08:25:19.778 [INFO][7498] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.784670 env[1563]: 2024-02-13 08:25:19.779 [INFO][7498] k8s.go 413: Added Mac, interface 
name, and active container ID to endpoint ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0", GenerateName:"calico-kube-controllers-646c8f86fc-", Namespace:"calico-system", SelfLink:"", UID:"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d", ResourceVersion:"1662", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"646c8f86fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def", Pod:"calico-kube-controllers-646c8f86fc-xkf58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0bd90e1344e", MAC:"a2:52:80:5c:05:26", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:19.784670 env[1563]: 2024-02-13 08:25:19.783 [INFO][7498] k8s.go 491: Wrote updated endpoint to datastore ContainerID="d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def" Namespace="calico-system" Pod="calico-kube-controllers-646c8f86fc-xkf58" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:19.790017 env[1563]: time="2024-02-13T08:25:19.789977787Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:25:19.790017 env[1563]: time="2024-02-13T08:25:19.790007052Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:25:19.790017 env[1563]: time="2024-02-13T08:25:19.790017632Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:25:19.790175 env[1563]: time="2024-02-13T08:25:19.790095299Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def pid=7555 runtime=io.containerd.runc.v2 Feb 13 08:25:19.789000 audit[7560]: NETFILTER_CFG table=filter:121 family=2 entries=38 op=nft_register_chain pid=7560 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:19.789000 audit[7560]: SYSCALL arch=c000003e syscall=46 success=yes exit=19508 a0=3 a1=7ffe5f2cdf90 a2=0 a3=7ffe5f2cdf7c items=0 ppid=6759 pid=7560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:19.789000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:19.816893 env[1563]: time="2024-02-13T08:25:19.816870566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-646c8f86fc-xkf58,Uid:d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d,Namespace:calico-system,Attempt:1,} returns sandbox id \"d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def\"" Feb 13 08:25:20.402230 systemd-networkd[1419]: cali29cf82678dc: Gained IPv6LL Feb 13 08:25:20.978332 systemd-networkd[1419]: cali0bd90e1344e: Gained IPv6LL Feb 13 08:25:21.656565 env[1563]: time="2024-02-13T08:25:21.656426928Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.746 [INFO][7604] k8s.go 578: Cleaning up netns ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.747 [INFO][7604] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" iface="eth0" netns="/var/run/netns/cni-a3dade4d-23e6-3176-17ae-5146f6856fc4" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.747 [INFO][7604] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" iface="eth0" netns="/var/run/netns/cni-a3dade4d-23e6-3176-17ae-5146f6856fc4" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.747 [INFO][7604] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" iface="eth0" netns="/var/run/netns/cni-a3dade4d-23e6-3176-17ae-5146f6856fc4" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.747 [INFO][7604] k8s.go 585: Releasing IP address(es) ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.747 [INFO][7604] utils.go 188: Calico CNI releasing IP address ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.792 [INFO][7625] ipam_plugin.go 415: Releasing address using handleID ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.793 [INFO][7625] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.793 [INFO][7625] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.804 [WARNING][7625] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.804 [INFO][7625] ipam_plugin.go 443: Releasing address using workloadID ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.806 [INFO][7625] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:21.810307 env[1563]: 2024-02-13 08:25:21.808 [INFO][7604] k8s.go 591: Teardown processing complete. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:21.811661 env[1563]: time="2024-02-13T08:25:21.810536898Z" level=info msg="TearDown network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" successfully" Feb 13 08:25:21.811661 env[1563]: time="2024-02-13T08:25:21.810609326Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" returns successfully" Feb 13 08:25:21.811661 env[1563]: time="2024-02-13T08:25:21.811537446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-tr9dh,Uid:26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0,Namespace:kube-system,Attempt:1,}" Feb 13 08:25:21.818300 systemd[1]: run-netns-cni\x2da3dade4d\x2d23e6\x2d3176\x2d17ae\x2d5146f6856fc4.mount: Deactivated successfully. 
Feb 13 08:25:21.941677 systemd-networkd[1419]: cali0fde6d0cc1b: Link UP Feb 13 08:25:21.997804 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Feb 13 08:25:21.997850 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali0fde6d0cc1b: link becomes ready Feb 13 08:25:21.997923 systemd-networkd[1419]: cali0fde6d0cc1b: Gained carrier Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.862 [INFO][7644] plugin.go 327: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0 coredns-787d4945fb- kube-system 26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0 1671 0 2024-02-13 08:17:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:787d4945fb projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3510.3.2-a-56b02fc11a coredns-787d4945fb-tr9dh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0fde6d0cc1b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.862 [INFO][7644] k8s.go 76: Extracted identifiers for CmdAddK8s ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.894 [INFO][7668] ipam_plugin.go 228: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" HandleID="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.907 [INFO][7668] ipam_plugin.go 268: Auto assigning IP ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" HandleID="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b5910), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3510.3.2-a-56b02fc11a", "pod":"coredns-787d4945fb-tr9dh", "timestamp":"2024-02-13 08:25:21.894095562 +0000 UTC"}, Hostname:"ci-3510.3.2-a-56b02fc11a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.907 [INFO][7668] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.908 [INFO][7668] ipam_plugin.go 371: Acquired host-wide IPAM lock. 
Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.908 [INFO][7668] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.2-a-56b02fc11a' Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.910 [INFO][7668] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.914 [INFO][7668] ipam.go 372: Looking up existing affinities for host host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.920 [INFO][7668] ipam.go 489: Trying affinity for 192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.922 [INFO][7668] ipam.go 155: Attempting to load block cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.925 [INFO][7668] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.925 [INFO][7668] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.100.128/26 handle="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.927 [INFO][7668] ipam.go 1682: Creating new handle: k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.932 [INFO][7668] ipam.go 1203: Writing block in order to claim IPs block=192.168.100.128/26 handle="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.938 [INFO][7668] ipam.go 1216: Successfully claimed IPs: [192.168.100.132/26] block=192.168.100.128/26 handle="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.938 [INFO][7668] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.132/26] handle="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.938 [INFO][7668] ipam_plugin.go 377: Released host-wide IPAM lock. 
Feb 13 08:25:22.003279 env[1563]: 2024-02-13 08:25:21.938 [INFO][7668] ipam_plugin.go 286: Calico CNI IPAM assigned addresses IPv4=[192.168.100.132/26] IPv6=[] ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" HandleID="k8s-pod-network.a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.003684 env[1563]: 2024-02-13 08:25:21.939 [INFO][7644] k8s.go 385: Populated endpoint ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0", ResourceVersion:"1671", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"", Pod:"coredns-787d4945fb-tr9dh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0fde6d0cc1b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:22.003684 env[1563]: 2024-02-13 08:25:21.940 [INFO][7644] k8s.go 386: Calico CNI using IPs: [192.168.100.132/32] ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.003684 env[1563]: 2024-02-13 08:25:21.940 [INFO][7644] dataplane_linux.go 68: Setting the host side veth name to cali0fde6d0cc1b ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.003684 env[1563]: 2024-02-13 08:25:21.997 [INFO][7644] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.003684 
env[1563]: 2024-02-13 08:25:21.997 [INFO][7644] k8s.go 413: Added Mac, interface name, and active container ID to endpoint ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0", ResourceVersion:"1671", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c", Pod:"coredns-787d4945fb-tr9dh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0fde6d0cc1b", MAC:"d6:8b:3e:39:83:b3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:22.003684 env[1563]: 2024-02-13 08:25:22.002 [INFO][7644] k8s.go 491: Wrote updated endpoint to datastore ContainerID="a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c" Namespace="kube-system" Pod="coredns-787d4945fb-tr9dh" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:22.008923 env[1563]: time="2024-02-13T08:25:22.008867070Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:25:22.008923 env[1563]: time="2024-02-13T08:25:22.008890130Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:25:22.008923 env[1563]: time="2024-02-13T08:25:22.008900216Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:25:22.009088 env[1563]: time="2024-02-13T08:25:22.008962868Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c pid=7706 runtime=io.containerd.runc.v2 Feb 13 08:25:22.008000 audit[7708]: NETFILTER_CFG table=filter:122 family=2 entries=44 op=nft_register_chain pid=7708 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:22.008000 audit[7708]: SYSCALL arch=c000003e syscall=46 success=yes exit=21940 a0=3 a1=7ffc220e1170 a2=0 a3=7ffc220e115c items=0 ppid=6759 pid=7708 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:22.008000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:22.035827 env[1563]: time="2024-02-13T08:25:22.035798826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-tr9dh,Uid:26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0,Namespace:kube-system,Attempt:1,} returns sandbox id \"a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c\"" Feb 13 08:25:22.036997 env[1563]: time="2024-02-13T08:25:22.036978920Z" level=info msg="CreateContainer within sandbox \"a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 08:25:22.040985 env[1563]: time="2024-02-13T08:25:22.040970260Z" level=info msg="CreateContainer within sandbox \"a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a575b402d0e4d8e92f485e6fb67586034d3a1a759036369c9eb835e809d516bc\"" Feb 13 08:25:22.041224 env[1563]: time="2024-02-13T08:25:22.041170945Z" level=info msg="StartContainer for \"a575b402d0e4d8e92f485e6fb67586034d3a1a759036369c9eb835e809d516bc\"" Feb 13 08:25:22.061045 env[1563]: time="2024-02-13T08:25:22.061001758Z" level=info msg="StartContainer for \"a575b402d0e4d8e92f485e6fb67586034d3a1a759036369c9eb835e809d516bc\" returns successfully" Feb 13 08:25:22.853926 kubelet[2738]: I0213 08:25:22.853899 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-787d4945fb-tr9dh" podStartSLOduration=448.853867786 pod.CreationTimestamp="2024-02-13 08:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:25:22.853380653 +0000 UTC m=+462.380289732" watchObservedRunningTime="2024-02-13 08:25:22.853867786 +0000 UTC m=+462.380776858" Feb 13 08:25:22.901000 audit[7814]: NETFILTER_CFG table=filter:123 family=2 entries=6 op=nft_register_rule pid=7814 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:22.901000 audit[7814]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffd638eff90 a2=0 a3=7ffd638eff7c items=0 ppid=3000 pid=7814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:22.901000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:22.903000 audit[7814]: NETFILTER_CFG table=nat:124 family=2 entries=60 op=nft_register_rule pid=7814 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:22.903000 audit[7814]: SYSCALL arch=c000003e syscall=46 success=yes exit=19324 a0=3 a1=7ffd638eff90 a2=0 a3=7ffd638eff7c items=0 ppid=3000 pid=7814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:22.903000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:23.006000 audit[7840]: NETFILTER_CFG table=filter:125 family=2 entries=6 op=nft_register_rule pid=7840 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:23.006000 audit[7840]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7fff01dafa70 a2=0 a3=7fff01dafa5c items=0 ppid=3000 pid=7840 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:23.006000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:23.055000 audit[7840]: NETFILTER_CFG table=nat:126 family=2 entries=72 op=nft_register_chain pid=7840 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:23.055000 audit[7840]: SYSCALL arch=c000003e syscall=46 success=yes exit=24988 a0=3 a1=7fff01dafa70 a2=0 a3=7fff01dafa5c items=0 ppid=3000 pid=7840 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:23.055000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:23.666268 systemd-networkd[1419]: cali0fde6d0cc1b: Gained IPv6LL Feb 13 08:25:24.092572 systemd[1]: Started sshd@34-145.40.67.79:22-139.178.68.195:46984.service. Feb 13 08:25:24.091000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.67.79:22-139.178.68.195:46984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:24.118438 kernel: kauditd_printk_skb: 22 callbacks suppressed Feb 13 08:25:24.118492 kernel: audit: type=1130 audit(1707812724.091:548): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.67.79:22-139.178.68.195:46984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:24.138928 sshd[7842]: Accepted publickey for core from 139.178.68.195 port 46984 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:24.140297 sshd[7842]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:24.142519 systemd-logind[1548]: New session 35 of user core. Feb 13 08:25:24.143099 systemd[1]: Started session-35.scope. 
Feb 13 08:25:24.137000 audit[7842]: USER_ACCT pid=7842 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.222109 sshd[7842]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:24.223574 systemd[1]: sshd@34-145.40.67.79:22-139.178.68.195:46984.service: Deactivated successfully. Feb 13 08:25:24.224227 systemd[1]: session-35.scope: Deactivated successfully. Feb 13 08:25:24.224237 systemd-logind[1548]: Session 35 logged out. Waiting for processes to exit. Feb 13 08:25:24.224813 systemd-logind[1548]: Removed session 35. Feb 13 08:25:24.293370 kernel: audit: type=1101 audit(1707812724.137:549): pid=7842 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.293400 kernel: audit: type=1103 audit(1707812724.139:550): pid=7842 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.139000 audit[7842]: CRED_ACQ pid=7842 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.356464 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2011583184.mount: Deactivated successfully. 
Feb 13 08:25:24.435697 kernel: audit: type=1006 audit(1707812724.139:551): pid=7842 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Feb 13 08:25:24.435753 kernel: audit: type=1300 audit(1707812724.139:551): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc685bd150 a2=3 a3=0 items=0 ppid=1 pid=7842 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:24.139000 audit[7842]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc685bd150 a2=3 a3=0 items=0 ppid=1 pid=7842 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:24.523943 kernel: audit: type=1327 audit(1707812724.139:551): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:24.139000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:24.553401 kernel: audit: type=1105 audit(1707812724.144:552): pid=7842 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.144000 audit[7842]: USER_START pid=7842 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.145000 audit[7845]: CRED_ACQ pid=7845 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.731998 kernel: audit: type=1103 audit(1707812724.145:553): pid=7845 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.732029 kernel: audit: type=1106 audit(1707812724.221:554): pid=7842 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.221000 audit[7842]: USER_END pid=7842 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.825459 kernel: audit: type=1104 audit(1707812724.221:555): pid=7842 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.221000 audit[7842]: CRED_DISP pid=7842 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:24.222000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.67.79:22-139.178.68.195:46984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:29.229238 systemd[1]: Started sshd@35-145.40.67.79:22-139.178.68.195:51230.service. Feb 13 08:25:29.228000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.67.79:22-139.178.68.195:51230 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:29.255912 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:29.255966 kernel: audit: type=1130 audit(1707812729.228:557): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.67.79:22-139.178.68.195:51230 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:29.363000 audit[7880]: USER_ACCT pid=7880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.364804 sshd[7880]: Accepted publickey for core from 139.178.68.195 port 51230 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:29.368806 sshd[7880]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:29.373268 systemd-logind[1548]: New session 36 of user core. Feb 13 08:25:29.373908 systemd[1]: Started session-36.scope. Feb 13 08:25:29.366000 audit[7880]: CRED_ACQ pid=7880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.454876 sshd[7880]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:29.456306 systemd[1]: sshd@35-145.40.67.79:22-139.178.68.195:51230.service: Deactivated successfully. Feb 13 08:25:29.456904 systemd-logind[1548]: Session 36 logged out. Waiting for processes to exit. Feb 13 08:25:29.456959 systemd[1]: session-36.scope: Deactivated successfully. Feb 13 08:25:29.457414 systemd-logind[1548]: Removed session 36. 
Feb 13 08:25:29.542864 kernel: audit: type=1101 audit(1707812729.363:558): pid=7880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.542915 kernel: audit: type=1103 audit(1707812729.366:559): pid=7880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.542928 kernel: audit: type=1006 audit(1707812729.367:560): pid=7880 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Feb 13 08:25:29.367000 audit[7880]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee398d4c0 a2=3 a3=0 items=0 ppid=1 pid=7880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:29.690342 kernel: audit: type=1300 audit(1707812729.367:560): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee398d4c0 a2=3 a3=0 items=0 ppid=1 pid=7880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:29.690375 kernel: audit: type=1327 audit(1707812729.367:560): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:29.367000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:29.720173 kernel: audit: type=1105 audit(1707812729.374:561): pid=7880 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.374000 audit[7880]: USER_START pid=7880 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.375000 audit[7883]: CRED_ACQ pid=7883 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.902823 kernel: audit: type=1103 audit(1707812729.375:562): pid=7883 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.902863 kernel: audit: type=1106 audit(1707812729.454:563): pid=7880 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.454000 audit[7880]: USER_END pid=7880 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.998138 kernel: audit: type=1104 audit(1707812729.454:564): pid=7880 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.454000 audit[7880]: CRED_DISP pid=7880 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:29.455000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.67.79:22-139.178.68.195:51230 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:34.461803 systemd[1]: Started sshd@36-145.40.67.79:22-139.178.68.195:51242.service. Feb 13 08:25:34.461000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.67.79:22-139.178.68.195:51242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:34.498651 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:34.498762 kernel: audit: type=1130 audit(1707812734.461:566): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.67.79:22-139.178.68.195:51242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:34.607127 sshd[7934]: Accepted publickey for core from 139.178.68.195 port 51242 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:34.606000 audit[7934]: USER_ACCT pid=7934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.609132 sshd[7934]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:34.611339 systemd-logind[1548]: New session 37 of user core. Feb 13 08:25:34.611833 systemd[1]: Started session-37.scope. Feb 13 08:25:34.690455 sshd[7934]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:34.691669 systemd[1]: sshd@36-145.40.67.79:22-139.178.68.195:51242.service: Deactivated successfully. Feb 13 08:25:34.692295 systemd[1]: session-37.scope: Deactivated successfully. Feb 13 08:25:34.692305 systemd-logind[1548]: Session 37 logged out. Waiting for processes to exit. Feb 13 08:25:34.692794 systemd-logind[1548]: Removed session 37. 
Feb 13 08:25:34.607000 audit[7934]: CRED_ACQ pid=7934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.788606 kernel: audit: type=1101 audit(1707812734.606:567): pid=7934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.788653 kernel: audit: type=1103 audit(1707812734.607:568): pid=7934 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.788675 kernel: audit: type=1006 audit(1707812734.607:569): pid=7934 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Feb 13 08:25:34.607000 audit[7934]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff64de1ec0 a2=3 a3=0 items=0 ppid=1 pid=7934 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:34.938957 kernel: audit: type=1300 audit(1707812734.607:569): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff64de1ec0 a2=3 a3=0 items=0 ppid=1 pid=7934 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:34.939023 kernel: audit: type=1327 audit(1707812734.607:569): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:34.607000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:34.969377 kernel: audit: type=1105 audit(1707812734.612:570): pid=7934 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.612000 audit[7934]: USER_START pid=7934 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:35.063686 kernel: audit: type=1103 audit(1707812734.613:571): pid=7943 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.613000 audit[7943]: CRED_ACQ pid=7943 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:35.152708 kernel: audit: type=1106 audit(1707812734.689:572): pid=7934 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.689000 audit[7934]: USER_END pid=7934 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.689000 audit[7934]: CRED_DISP pid=7934 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:35.337139 kernel: audit: type=1104 audit(1707812734.689:573): pid=7934 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:34.690000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.67.79:22-139.178.68.195:51242 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:39.696882 systemd[1]: Started sshd@37-145.40.67.79:22-139.178.68.195:59342.service. Feb 13 08:25:39.695000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.67.79:22-139.178.68.195:59342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:39.723760 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:39.723837 kernel: audit: type=1130 audit(1707812739.695:575): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.67.79:22-139.178.68.195:59342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:39.833000 audit[7967]: USER_ACCT pid=7967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:39.834231 sshd[7967]: Accepted publickey for core from 139.178.68.195 port 59342 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:39.835365 sshd[7967]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:39.837878 systemd-logind[1548]: New session 38 of user core. Feb 13 08:25:39.838305 systemd[1]: Started session-38.scope. Feb 13 08:25:39.916585 sshd[7967]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:39.917968 systemd[1]: sshd@37-145.40.67.79:22-139.178.68.195:59342.service: Deactivated successfully. Feb 13 08:25:39.918642 systemd[1]: session-38.scope: Deactivated successfully. Feb 13 08:25:39.918642 systemd-logind[1548]: Session 38 logged out. Waiting for processes to exit. Feb 13 08:25:39.919092 systemd-logind[1548]: Removed session 38. 
Feb 13 08:25:39.834000 audit[7967]: CRED_ACQ pid=7967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:40.015531 kernel: audit: type=1101 audit(1707812739.833:576): pid=7967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:40.015568 kernel: audit: type=1103 audit(1707812739.834:577): pid=7967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:40.015584 kernel: audit: type=1006 audit(1707812739.834:578): pid=7967 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=38 res=1 Feb 13 08:25:40.073971 kernel: audit: type=1300 audit(1707812739.834:578): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff83c23d90 a2=3 a3=0 items=0 ppid=1 pid=7967 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:39.834000 audit[7967]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff83c23d90 a2=3 a3=0 items=0 ppid=1 pid=7967 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:40.165792 kernel: audit: type=1327 audit(1707812739.834:578): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:39.834000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:40.196172 kernel: audit: type=1105 audit(1707812739.839:579): pid=7967 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:39.839000 audit[7967]: USER_START pid=7967 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:40.290572 kernel: audit: type=1103 audit(1707812739.839:580): pid=7970 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:39.839000 audit[7970]: CRED_ACQ pid=7970 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:40.379586 kernel: audit: type=1106 audit(1707812739.916:581): pid=7967 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:39.916000 audit[7967]: USER_END pid=7967 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:40.474923 kernel: audit: type=1104 audit(1707812739.916:582): pid=7967 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:39.916000 audit[7967]: CRED_DISP pid=7967 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:39.917000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.67.79:22-139.178.68.195:59342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:40.650593 env[1563]: time="2024-02-13T08:25:40.650464869Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.680 [WARNING][8008] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0", ResourceVersion:"1684", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c", Pod:"coredns-787d4945fb-tr9dh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0fde6d0cc1b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 
08:25:40.697462 env[1563]: 2024-02-13 08:25:40.680 [INFO][8008] k8s.go 578: Cleaning up netns ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.680 [INFO][8008] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" iface="eth0" netns="" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.680 [INFO][8008] k8s.go 585: Releasing IP address(es) ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.680 [INFO][8008] utils.go 188: Calico CNI releasing IP address ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.690 [INFO][8025] ipam_plugin.go 415: Releasing address using handleID ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.690 [INFO][8025] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.690 [INFO][8025] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.695 [WARNING][8025] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.695 [INFO][8025] ipam_plugin.go 443: Releasing address using workloadID ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.696 [INFO][8025] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.697462 env[1563]: 2024-02-13 08:25:40.696 [INFO][8008] k8s.go 591: Teardown processing complete. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.697879 env[1563]: time="2024-02-13T08:25:40.697483057Z" level=info msg="TearDown network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" successfully" Feb 13 08:25:40.697879 env[1563]: time="2024-02-13T08:25:40.697508965Z" level=info msg="StopPodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" returns successfully" Feb 13 08:25:40.697879 env[1563]: time="2024-02-13T08:25:40.697859331Z" level=info msg="RemovePodSandbox for \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:25:40.697938 env[1563]: time="2024-02-13T08:25:40.697877652Z" level=info msg="Forcibly stopping sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\"" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.717 [WARNING][8052] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. 
ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"26f9cd17-4f15-4ae3-afc2-cd7ad4cc54c0", ResourceVersion:"1684", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"a4eee9b298ad4e03b5396f2ce26ca13d8c1bd3812329037b32c627aa24d8af3c", Pod:"coredns-787d4945fb-tr9dh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0fde6d0cc1b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.717 [INFO][8052] k8s.go 578: Cleaning up netns ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.717 [INFO][8052] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" iface="eth0" netns="" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.717 [INFO][8052] k8s.go 585: Releasing IP address(es) ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.718 [INFO][8052] utils.go 188: Calico CNI releasing IP address ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.729 [INFO][8066] ipam_plugin.go 415: Releasing address using handleID ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.729 [INFO][8066] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.729 [INFO][8066] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.734 [WARNING][8066] ipam_plugin.go 432: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.734 [INFO][8066] ipam_plugin.go 443: Releasing address using workloadID ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" HandleID="k8s-pod-network.2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--tr9dh-eth0" Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.735 [INFO][8066] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.736955 env[1563]: 2024-02-13 08:25:40.736 [INFO][8052] k8s.go 591: Teardown processing complete. ContainerID="2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41" Feb 13 08:25:40.737495 env[1563]: time="2024-02-13T08:25:40.736981969Z" level=info msg="TearDown network for sandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" successfully" Feb 13 08:25:40.738736 env[1563]: time="2024-02-13T08:25:40.738720418Z" level=info msg="RemovePodSandbox \"2c46718c7e3c15598b4fcefc8a7169d0805cc33414447c85b7ef3b4348ed2a41\" returns successfully" Feb 13 08:25:40.739044 env[1563]: time="2024-02-13T08:25:40.739027221Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.763 [WARNING][8096] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8100728f-8434-43ba-8770-5d3f00e1f18f", ResourceVersion:"1660", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7c77f88967", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4", Pod:"csi-node-driver-mrqvb", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.100.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali29cf82678dc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.763 [INFO][8096] k8s.go 578: Cleaning up netns ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.763 [INFO][8096] 
dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" iface="eth0" netns="" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.763 [INFO][8096] k8s.go 585: Releasing IP address(es) ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.763 [INFO][8096] utils.go 188: Calico CNI releasing IP address ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.779 [INFO][8114] ipam_plugin.go 415: Releasing address using handleID ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.779 [INFO][8114] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.779 [INFO][8114] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.784 [WARNING][8114] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.784 [INFO][8114] ipam_plugin.go 443: Releasing address using workloadID ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.785 [INFO][8114] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.786953 env[1563]: 2024-02-13 08:25:40.786 [INFO][8096] k8s.go 591: Teardown processing complete. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.787507 env[1563]: time="2024-02-13T08:25:40.786975480Z" level=info msg="TearDown network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" successfully" Feb 13 08:25:40.787507 env[1563]: time="2024-02-13T08:25:40.787014858Z" level=info msg="StopPodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" returns successfully" Feb 13 08:25:40.787507 env[1563]: time="2024-02-13T08:25:40.787336052Z" level=info msg="RemovePodSandbox for \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:25:40.787507 env[1563]: time="2024-02-13T08:25:40.787362987Z" level=info msg="Forcibly stopping sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\"" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.813 [WARNING][8143] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. 
ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8100728f-8434-43ba-8770-5d3f00e1f18f", ResourceVersion:"1660", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7c77f88967", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4", Pod:"csi-node-driver-mrqvb", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.100.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali29cf82678dc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.813 [INFO][8143] k8s.go 578: Cleaning up netns ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.813 [INFO][8143] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" iface="eth0" netns="" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.813 [INFO][8143] k8s.go 585: Releasing IP address(es) ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.813 [INFO][8143] utils.go 188: Calico CNI releasing IP address ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.824 [INFO][8159] ipam_plugin.go 415: Releasing address using handleID ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.824 [INFO][8159] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.824 [INFO][8159] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.828 [WARNING][8159] ipam_plugin.go 432: Asked to release address but it doesn't exist. 
Ignoring ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.828 [INFO][8159] ipam_plugin.go 443: Releasing address using workloadID ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" HandleID="k8s-pod-network.62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Workload="ci--3510.3.2--a--56b02fc11a-k8s-csi--node--driver--mrqvb-eth0" Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.829 [INFO][8159] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.830315 env[1563]: 2024-02-13 08:25:40.829 [INFO][8143] k8s.go 591: Teardown processing complete. ContainerID="62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f" Feb 13 08:25:40.830755 env[1563]: time="2024-02-13T08:25:40.830334852Z" level=info msg="TearDown network for sandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" successfully" Feb 13 08:25:40.831762 env[1563]: time="2024-02-13T08:25:40.831749560Z" level=info msg="RemovePodSandbox \"62254fe623a34159b1c0227224351c2e2d8fc04d8be286cbaf6936126b96e93f\" returns successfully" Feb 13 08:25:40.832017 env[1563]: time="2024-02-13T08:25:40.832005462Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.849 [WARNING][8184] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"7d5a107e-32fc-46ef-9ba6-381664363494", ResourceVersion:"1644", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35", Pod:"coredns-787d4945fb-vhpxp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali959243e1537", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, 
AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.849 [INFO][8184] k8s.go 578: Cleaning up netns ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.849 [INFO][8184] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" iface="eth0" netns="" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.849 [INFO][8184] k8s.go 585: Releasing IP address(es) ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.849 [INFO][8184] utils.go 188: Calico CNI releasing IP address ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.861 [INFO][8197] ipam_plugin.go 415: Releasing address using handleID ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.861 [INFO][8197] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.861 [INFO][8197] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.865 [WARNING][8197] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.865 [INFO][8197] ipam_plugin.go 443: Releasing address using workloadID ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.867 [INFO][8197] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.868477 env[1563]: 2024-02-13 08:25:40.867 [INFO][8184] k8s.go 591: Teardown processing complete. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.868989 env[1563]: time="2024-02-13T08:25:40.868477158Z" level=info msg="TearDown network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" successfully" Feb 13 08:25:40.868989 env[1563]: time="2024-02-13T08:25:40.868497659Z" level=info msg="StopPodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" returns successfully" Feb 13 08:25:40.868989 env[1563]: time="2024-02-13T08:25:40.868778596Z" level=info msg="RemovePodSandbox for \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:25:40.868989 env[1563]: time="2024-02-13T08:25:40.868808584Z" level=info msg="Forcibly stopping sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\"" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.889 [WARNING][8224] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. 
ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0", GenerateName:"coredns-787d4945fb-", Namespace:"kube-system", SelfLink:"", UID:"7d5a107e-32fc-46ef-9ba6-381664363494", ResourceVersion:"1644", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"787d4945fb", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"6a5355d995914c8ecd5d905f54d5115c6ee7504592b59c03bd41b184576e8e35", Pod:"coredns-787d4945fb-vhpxp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali959243e1537", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.889 [INFO][8224] k8s.go 578: Cleaning up netns ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.889 [INFO][8224] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" iface="eth0" netns="" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.889 [INFO][8224] k8s.go 585: Releasing IP address(es) ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.889 [INFO][8224] utils.go 188: Calico CNI releasing IP address ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.900 [INFO][8240] ipam_plugin.go 415: Releasing address using handleID ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.900 [INFO][8240] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.900 [INFO][8240] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.904 [WARNING][8240] ipam_plugin.go 432: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.904 [INFO][8240] ipam_plugin.go 443: Releasing address using workloadID ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" HandleID="k8s-pod-network.5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Workload="ci--3510.3.2--a--56b02fc11a-k8s-coredns--787d4945fb--vhpxp-eth0" Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.905 [INFO][8240] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.907205 env[1563]: 2024-02-13 08:25:40.906 [INFO][8224] k8s.go 591: Teardown processing complete. ContainerID="5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9" Feb 13 08:25:40.907724 env[1563]: time="2024-02-13T08:25:40.907176917Z" level=info msg="TearDown network for sandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" successfully" Feb 13 08:25:40.909362 env[1563]: time="2024-02-13T08:25:40.909344347Z" level=info msg="RemovePodSandbox \"5aa58de9777a8e2839770ed647cf08361dbfca4c59967c0638b9357ddaac5ff9\" returns successfully" Feb 13 08:25:40.909668 env[1563]: time="2024-02-13T08:25:40.909654126Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.930 [WARNING][8275] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0", GenerateName:"calico-kube-controllers-646c8f86fc-", Namespace:"calico-system", SelfLink:"", UID:"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d", ResourceVersion:"1666", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"646c8f86fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def", Pod:"calico-kube-controllers-646c8f86fc-xkf58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0bd90e1344e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.930 [INFO][8275] k8s.go 578: Cleaning up netns ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.948607 
env[1563]: 2024-02-13 08:25:40.930 [INFO][8275] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" iface="eth0" netns="" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.930 [INFO][8275] k8s.go 585: Releasing IP address(es) ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.930 [INFO][8275] utils.go 188: Calico CNI releasing IP address ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.941 [INFO][8288] ipam_plugin.go 415: Releasing address using handleID ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.941 [INFO][8288] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.941 [INFO][8288] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.945 [WARNING][8288] ipam_plugin.go 432: Asked to release address but it doesn't exist. Ignoring ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.946 [INFO][8288] ipam_plugin.go 443: Releasing address using workloadID ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.947 [INFO][8288] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.948607 env[1563]: 2024-02-13 08:25:40.947 [INFO][8275] k8s.go 591: Teardown processing complete. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.949022 env[1563]: time="2024-02-13T08:25:40.948626450Z" level=info msg="TearDown network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" successfully" Feb 13 08:25:40.949022 env[1563]: time="2024-02-13T08:25:40.948647185Z" level=info msg="StopPodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" returns successfully" Feb 13 08:25:40.949022 env[1563]: time="2024-02-13T08:25:40.948923272Z" level=info msg="RemovePodSandbox for \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:25:40.949022 env[1563]: time="2024-02-13T08:25:40.948948507Z" level=info msg="Forcibly stopping sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\"" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.971 [WARNING][8316] k8s.go 542: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. 
ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0", GenerateName:"calico-kube-controllers-646c8f86fc-", Namespace:"calico-system", SelfLink:"", UID:"d23a8efa-b8cf-42bd-9e8d-8f13dc9b2a4d", ResourceVersion:"1666", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 17, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"646c8f86fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def", Pod:"calico-kube-controllers-646c8f86fc-xkf58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0bd90e1344e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.971 [INFO][8316] k8s.go 578: Cleaning up netns ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.971 [INFO][8316] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" iface="eth0" netns="" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.971 [INFO][8316] k8s.go 585: Releasing IP address(es) ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.971 [INFO][8316] utils.go 188: Calico CNI releasing IP address ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.983 [INFO][8333] ipam_plugin.go 415: Releasing address using handleID ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.983 [INFO][8333] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.983 [INFO][8333] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.988 [WARNING][8333] ipam_plugin.go 432: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.988 [INFO][8333] ipam_plugin.go 443: Releasing address using workloadID ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" HandleID="k8s-pod-network.ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--kube--controllers--646c8f86fc--xkf58-eth0" Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.990 [INFO][8333] ipam_plugin.go 377: Released host-wide IPAM lock. Feb 13 08:25:40.991499 env[1563]: 2024-02-13 08:25:40.990 [INFO][8316] k8s.go 591: Teardown processing complete. ContainerID="ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb" Feb 13 08:25:40.991916 env[1563]: time="2024-02-13T08:25:40.991512216Z" level=info msg="TearDown network for sandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" successfully" Feb 13 08:25:40.993093 env[1563]: time="2024-02-13T08:25:40.993044695Z" level=info msg="RemovePodSandbox \"ed9ab9dae0faf9705e71d31e509828e13412e9e09c2561851507a3ea34923feb\" returns successfully" Feb 13 08:25:41.431837 env[1563]: time="2024-02-13T08:25:41.431785671Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:41.432465 env[1563]: time="2024-02-13T08:25:41.432425096Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:91c1c91da7602f16686c149419195b486669f3a1828fd320cf332fdc6a25297d,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:41.433168 env[1563]: time="2024-02-13T08:25:41.433117774Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/csi:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:41.434217 env[1563]: time="2024-02-13T08:25:41.434165395Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi@sha256:2b9021393c17e87ba8a3c89f5b3719941812f4e4751caa0b71eb2233bff48738,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:41.434481 env[1563]: time="2024-02-13T08:25:41.434446231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.27.0\" returns image reference \"sha256:91c1c91da7602f16686c149419195b486669f3a1828fd320cf332fdc6a25297d\"" Feb 13 08:25:41.434888 env[1563]: time="2024-02-13T08:25:41.434876849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.27.0\"" Feb 13 08:25:41.435556 env[1563]: time="2024-02-13T08:25:41.435543956Z" level=info msg="CreateContainer within sandbox \"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 08:25:41.440514 env[1563]: time="2024-02-13T08:25:41.440470796Z" level=info msg="CreateContainer within sandbox \"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a50752fd98f058bd1dd03e497f470cdff6b2bf86778edfc21d949c2d6a514dc4\"" Feb 13 08:25:41.440922 env[1563]: time="2024-02-13T08:25:41.440865859Z" level=info msg="StartContainer for 
\"a50752fd98f058bd1dd03e497f470cdff6b2bf86778edfc21d949c2d6a514dc4\"" Feb 13 08:25:41.465122 env[1563]: time="2024-02-13T08:25:41.465096164Z" level=info msg="StartContainer for \"a50752fd98f058bd1dd03e497f470cdff6b2bf86778edfc21d949c2d6a514dc4\" returns successfully" Feb 13 08:25:44.923450 systemd[1]: Started sshd@38-145.40.67.79:22-139.178.68.195:59348.service. Feb 13 08:25:44.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.67.79:22-139.178.68.195:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:44.949962 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:44.950042 kernel: audit: type=1130 audit(1707812744.922:584): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.67.79:22-139.178.68.195:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:45.059000 audit[8385]: USER_ACCT pid=8385 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.060257 sshd[8385]: Accepted publickey for core from 139.178.68.195 port 59348 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:45.063273 sshd[8385]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:45.065606 systemd-logind[1548]: New session 39 of user core. Feb 13 08:25:45.066349 systemd[1]: Started session-39.scope. Feb 13 08:25:45.145940 sshd[8385]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:45.147222 systemd[1]: sshd@38-145.40.67.79:22-139.178.68.195:59348.service: Deactivated successfully. Feb 13 08:25:45.147827 systemd-logind[1548]: Session 39 logged out. Waiting for processes to exit. Feb 13 08:25:45.147867 systemd[1]: session-39.scope: Deactivated successfully. Feb 13 08:25:45.148312 systemd-logind[1548]: Removed session 39. 
Feb 13 08:25:45.062000 audit[8385]: CRED_ACQ pid=8385 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.242394 kernel: audit: type=1101 audit(1707812745.059:585): pid=8385 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.242432 kernel: audit: type=1103 audit(1707812745.062:586): pid=8385 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.242449 kernel: audit: type=1006 audit(1707812745.062:587): pid=8385 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1 Feb 13 08:25:45.300819 kernel: audit: type=1300 audit(1707812745.062:587): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc66f7660 a2=3 a3=0 items=0 ppid=1 pid=8385 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:45.062000 audit[8385]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc66f7660 a2=3 a3=0 items=0 ppid=1 pid=8385 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:45.392723 kernel: audit: type=1327 audit(1707812745.062:587): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:45.062000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:45.423128 kernel: audit: type=1105 audit(1707812745.067:588): pid=8385 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.067000 audit[8385]: USER_START pid=8385 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.517480 kernel: audit: type=1103 audit(1707812745.068:589): pid=8388 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.068000 audit[8388]: CRED_ACQ pid=8388 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.606633 kernel: audit: type=1106 audit(1707812745.145:590): pid=8385 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.145000 audit[8385]: USER_END pid=8385 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.701994 kernel: audit: type=1104 audit(1707812745.145:591): pid=8385 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.145000 audit[8385]: CRED_DISP pid=8385 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:45.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.67.79:22-139.178.68.195:59348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:50.154262 systemd[1]: Started sshd@39-145.40.67.79:22-139.178.68.195:56186.service. Feb 13 08:25:50.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.67.79:22-139.178.68.195:56186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:50.181515 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:25:50.181600 kernel: audit: type=1130 audit(1707812750.153:593): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.67.79:22-139.178.68.195:56186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:50.290000 audit[8412]: USER_ACCT pid=8412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.291462 sshd[8412]: Accepted publickey for core from 139.178.68.195 port 56186 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:50.294295 sshd[8412]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:50.296717 systemd-logind[1548]: New session 40 of user core. Feb 13 08:25:50.297276 systemd[1]: Started session-40.scope. Feb 13 08:25:50.376790 sshd[8412]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:50.378232 systemd[1]: sshd@39-145.40.67.79:22-139.178.68.195:56186.service: Deactivated successfully. Feb 13 08:25:50.378878 systemd-logind[1548]: Session 40 logged out. Waiting for processes to exit. Feb 13 08:25:50.378914 systemd[1]: session-40.scope: Deactivated successfully. Feb 13 08:25:50.379481 systemd-logind[1548]: Removed session 40. 
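The sandbox teardown records earlier in this log (StopPodSandbox for ed9ab9dae0fa..., followed by "Forcibly stopping sandbox") show two guard rails in Calico's CNI DEL path: the workload endpoint is left in place when CNI_CONTAINERID does not match the endpoint's recorded ContainerID, and a release for an IPAM handle that no longer exists is logged as a warning and ignored, all while holding the host-wide IPAM lock. The sketch below only illustrates that idempotent-release shape; the in-memory store, the sandbox/handle names, and the address are made-up stand-ins, not Calico's actual ipam_plugin.go or k8s.go code.

```go
// Minimal sketch of an idempotent CNI DEL, loosely modelled on the teardown
// sequence in the log. All types and values here are hypothetical stand-ins.
package main

import (
	"fmt"
	"sync"
)

// ipamStore maps a handle ID (e.g. "k8s-pod-network.<sandboxID>") to the
// addresses allocated for it. The mutex plays the role of the
// "host-wide IPAM lock" acquired and released in the log.
type ipamStore struct {
	mu      sync.Mutex
	handles map[string][]string
}

// release frees whatever is recorded under handleID. A missing handle is not
// an error: repeated DELs for the same sandbox must stay safe to replay.
func (s *ipamStore) release(handleID string) []string {
	s.mu.Lock()         // "About to acquire host-wide IPAM lock."
	defer s.mu.Unlock() // "Released host-wide IPAM lock."

	ips, ok := s.handles[handleID]
	if !ok {
		fmt.Printf("WARNING: asked to release %q but it doesn't exist, ignoring\n", handleID)
		return nil
	}
	delete(s.handles, handleID)
	return ips
}

// teardown mirrors the DEL flow: warn (and keep the endpoint record) when the
// DEL's container ID is not the one stored on the endpoint, then release IPs.
func teardown(s *ipamStore, cniContainerID, endpointContainerID, handleID string) {
	if cniContainerID != endpointContainerID {
		fmt.Println("WARNING: CNI_CONTAINERID does not match the endpoint's ContainerID, keeping the endpoint record")
	}
	released := s.release(handleID)
	fmt.Printf("teardown complete, released IPs: %v\n", released)
}

func main() {
	s := &ipamStore{handles: map[string][]string{
		// Illustrative handle and address, not taken from the log.
		"k8s-pod-network.sandbox-a": {"192.168.100.130/32"},
	}}
	teardown(s, "sandbox-a", "sandbox-a", "k8s-pod-network.sandbox-a")
	// A repeated, forced DEL for the same sandbox must be a no-op.
	teardown(s, "sandbox-a", "sandbox-b", "k8s-pod-network.sandbox-a")
}
```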
Feb 13 08:25:50.293000 audit[8412]: CRED_ACQ pid=8412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.474232 kernel: audit: type=1101 audit(1707812750.290:594): pid=8412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.474285 kernel: audit: type=1103 audit(1707812750.293:595): pid=8412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.474299 kernel: audit: type=1006 audit(1707812750.293:596): pid=8412 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1 Feb 13 08:25:50.293000 audit[8412]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef5484210 a2=3 a3=0 items=0 ppid=1 pid=8412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:50.624658 kernel: audit: type=1300 audit(1707812750.293:596): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef5484210 a2=3 a3=0 items=0 ppid=1 pid=8412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:50.624717 kernel: audit: type=1327 audit(1707812750.293:596): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:50.624739 kernel: audit: type=1105 audit(1707812750.298:597): pid=8412 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.293000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:50.298000 audit[8412]: USER_START pid=8412 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.299000 audit[8415]: CRED_ACQ pid=8415 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.837809 env[1563]: time="2024-02-13T08:25:50.837734514Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:50.838406 kernel: audit: type=1103 audit(1707812750.299:598): pid=8415 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.838445 
kernel: audit: type=1106 audit(1707812750.376:599): pid=8412 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.376000 audit[8412]: USER_END pid=8412 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.838499 env[1563]: time="2024-02-13T08:25:50.838451541Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4e87edec0297dadd6f3bb25b2f540fd40e2abed9fff582c97ff4cd751d3f9803,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:50.839517 env[1563]: time="2024-02-13T08:25:50.839473095Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:50.840478 env[1563]: time="2024-02-13T08:25:50.840437440Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers@sha256:e264ab1fb2f1ae90dd1d84e226d11d2eb4350e74ac27de4c65f29f5aadba5bb1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:25:50.841352 env[1563]: time="2024-02-13T08:25:50.841309210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.27.0\" returns image reference \"sha256:4e87edec0297dadd6f3bb25b2f540fd40e2abed9fff582c97ff4cd751d3f9803\"" Feb 13 08:25:50.841548 env[1563]: time="2024-02-13T08:25:50.841533807Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.27.0\"" Feb 13 08:25:50.844699 env[1563]: time="2024-02-13T08:25:50.844679533Z" level=info msg="CreateContainer within sandbox \"d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Feb 13 08:25:50.376000 audit[8412]: CRED_DISP pid=8412 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:51.022806 kernel: audit: type=1104 audit(1707812750.376:600): pid=8412 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:50.377000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.67.79:22-139.178.68.195:56186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:25:51.024388 env[1563]: time="2024-02-13T08:25:51.024341597Z" level=info msg="CreateContainer within sandbox \"d3cb075b9025b110df7355fd8712c1ed3c3ea003f8d5b1718d695bd987707def\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1a557e8798e3ae44ad504ea3878bf9064c10b52abec8f9eee09a3d2c56b8d658\"" Feb 13 08:25:51.024580 env[1563]: time="2024-02-13T08:25:51.024550409Z" level=info msg="StartContainer for \"1a557e8798e3ae44ad504ea3878bf9064c10b52abec8f9eee09a3d2c56b8d658\"" Feb 13 08:25:51.056183 env[1563]: time="2024-02-13T08:25:51.056121722Z" level=info msg="StartContainer for \"1a557e8798e3ae44ad504ea3878bf9064c10b52abec8f9eee09a3d2c56b8d658\" returns successfully" Feb 13 08:25:51.950183 kubelet[2738]: I0213 08:25:51.950105 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-646c8f86fc-xkf58" podStartSLOduration=-9.223371563904806e+09 pod.CreationTimestamp="2024-02-13 08:17:59 +0000 UTC" firstStartedPulling="2024-02-13 08:25:19.817458655 +0000 UTC m=+459.344367722" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:25:51.949316139 +0000 UTC m=+491.476225292" watchObservedRunningTime="2024-02-13 08:25:51.949970591 +0000 UTC m=+491.476879700" Feb 13 08:25:54.178646 kubelet[2738]: I0213 08:25:54.178610 2738 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:25:54.201000 audit[8525]: NETFILTER_CFG table=filter:127 family=2 entries=7 op=nft_register_rule pid=8525 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:54.201000 audit[8525]: SYSCALL arch=c000003e syscall=46 success=yes exit=2620 a0=3 a1=7ffd90459d30 a2=0 a3=7ffd90459d1c items=0 ppid=3000 pid=8525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:54.201000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:54.202000 audit[8525]: NETFILTER_CFG table=nat:128 family=2 entries=78 op=nft_register_rule pid=8525 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:54.202000 audit[8525]: SYSCALL arch=c000003e syscall=46 success=yes exit=24988 a0=3 a1=7ffd90459d30 a2=0 a3=7ffd90459d1c items=0 ppid=3000 pid=8525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:54.202000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:54.243000 audit[8551]: NETFILTER_CFG table=filter:129 family=2 entries=8 op=nft_register_rule pid=8551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:54.243000 audit[8551]: SYSCALL arch=c000003e syscall=46 success=yes exit=2620 a0=3 a1=7ffc6d010920 a2=0 a3=7ffc6d01090c items=0 ppid=3000 pid=8551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:54.243000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:54.244000 audit[8551]: NETFILTER_CFG 
table=nat:130 family=2 entries=78 op=nft_register_rule pid=8551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:25:54.244000 audit[8551]: SYSCALL arch=c000003e syscall=46 success=yes exit=24988 a0=3 a1=7ffc6d010920 a2=0 a3=7ffc6d01090c items=0 ppid=3000 pid=8551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:54.244000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:25:54.337499 kubelet[2738]: I0213 08:25:54.337393 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8043ab31-8db8-4cf0-bf00-6b4c63970e02-calico-apiserver-certs\") pod \"calico-apiserver-bf6ccd78c-n22rv\" (UID: \"8043ab31-8db8-4cf0-bf00-6b4c63970e02\") " pod="calico-apiserver/calico-apiserver-bf6ccd78c-n22rv" Feb 13 08:25:54.337799 kubelet[2738]: I0213 08:25:54.337588 2738 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jtmw\" (UniqueName: \"kubernetes.io/projected/8043ab31-8db8-4cf0-bf00-6b4c63970e02-kube-api-access-5jtmw\") pod \"calico-apiserver-bf6ccd78c-n22rv\" (UID: \"8043ab31-8db8-4cf0-bf00-6b4c63970e02\") " pod="calico-apiserver/calico-apiserver-bf6ccd78c-n22rv" Feb 13 08:25:54.439372 kubelet[2738]: E0213 08:25:54.439169 2738 secret.go:194] Couldn't get secret calico-apiserver/calico-apiserver-certs: secret "calico-apiserver-certs" not found Feb 13 08:25:54.439372 kubelet[2738]: E0213 08:25:54.439347 2738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8043ab31-8db8-4cf0-bf00-6b4c63970e02-calico-apiserver-certs podName:8043ab31-8db8-4cf0-bf00-6b4c63970e02 nodeName:}" failed. No retries permitted until 2024-02-13 08:25:54.939296756 +0000 UTC m=+494.466205867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/8043ab31-8db8-4cf0-bf00-6b4c63970e02-calico-apiserver-certs") pod "calico-apiserver-bf6ccd78c-n22rv" (UID: "8043ab31-8db8-4cf0-bf00-6b4c63970e02") : secret "calico-apiserver-certs" not found Feb 13 08:25:55.083681 env[1563]: time="2024-02-13T08:25:55.083554240Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bf6ccd78c-n22rv,Uid:8043ab31-8db8-4cf0-bf00-6b4c63970e02,Namespace:calico-apiserver,Attempt:0,}" Feb 13 08:25:55.236428 systemd-networkd[1419]: cali9bf6edd3f03: Link UP Feb 13 08:25:55.291976 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Feb 13 08:25:55.292069 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali9bf6edd3f03: link becomes ready Feb 13 08:25:55.292106 systemd-networkd[1419]: cali9bf6edd3f03: Gained carrier Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.167 [INFO][8582] plugin.go 327: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0 calico-apiserver-bf6ccd78c- calico-apiserver 8043ab31-8db8-4cf0-bf00-6b4c63970e02 1810 0 2024-02-13 08:25:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:bf6ccd78c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-3510.3.2-a-56b02fc11a calico-apiserver-bf6ccd78c-n22rv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9bf6edd3f03 [] []}} ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.168 [INFO][8582] k8s.go 76: Extracted identifiers for CmdAddK8s ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.198 [INFO][8605] ipam_plugin.go 228: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" HandleID="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.208 [INFO][8605] ipam_plugin.go 268: Auto assigning IP ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" HandleID="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033d870), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-3510.3.2-a-56b02fc11a", "pod":"calico-apiserver-bf6ccd78c-n22rv", "timestamp":"2024-02-13 08:25:55.198292316 +0000 UTC"}, Hostname:"ci-3510.3.2-a-56b02fc11a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 08:25:55.297854 env[1563]: 
2024-02-13 08:25:55.208 [INFO][8605] ipam_plugin.go 356: About to acquire host-wide IPAM lock. Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.208 [INFO][8605] ipam_plugin.go 371: Acquired host-wide IPAM lock. Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.208 [INFO][8605] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.2-a-56b02fc11a' Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.210 [INFO][8605] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.214 [INFO][8605] ipam.go 372: Looking up existing affinities for host host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.218 [INFO][8605] ipam.go 489: Trying affinity for 192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.220 [INFO][8605] ipam.go 155: Attempting to load block cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.223 [INFO][8605] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.100.128/26 host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.223 [INFO][8605] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.100.128/26 handle="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.224 [INFO][8605] ipam.go 1682: Creating new handle: k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4 Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.228 [INFO][8605] ipam.go 1203: Writing block in order to claim IPs block=192.168.100.128/26 handle="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.233 [INFO][8605] ipam.go 1216: Successfully claimed IPs: [192.168.100.133/26] block=192.168.100.128/26 handle="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.233 [INFO][8605] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.100.133/26] handle="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" host="ci-3510.3.2-a-56b02fc11a" Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.233 [INFO][8605] ipam_plugin.go 377: Released host-wide IPAM lock. 
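The ipam.go lines just above walk through the ADD-side assignment for the new calico-apiserver pod: acquire the host-wide IPAM lock, look up the host's block affinities, confirm the affined block 192.168.100.128/26, claim the next free address (192.168.100.133/26) for the handle, write the block, and release the lock. The Go sketch below mimics that first-free-address-in-an-affined-block idea only; the allocator, its in-memory "used" map, and the pre-claimed addresses are illustrative assumptions, not Calico's real block data model.

```go
// Simplified sketch of assigning one address from a host-affined /26 block.
package main

import (
	"fmt"
	"net"
	"sync"
)

type block struct {
	cidr *net.IPNet
	used map[string]string // IP -> handle ID that owns it
}

type allocator struct {
	mu     sync.Mutex // stands in for the host-wide IPAM lock
	blocks []*block
}

// autoAssign claims the first free address in the first affined block and
// records which handle owns it ("Writing block in order to claim IPs").
func (a *allocator) autoAssign(handleID string) (net.IP, error) {
	a.mu.Lock()
	defer a.mu.Unlock()

	for _, b := range a.blocks {
		ones, bits := b.cidr.Mask.Size()
		total := 1 << (bits - ones) // 64 addresses in a /26
		base := b.cidr.IP.To4()
		for i := 0; i < total; i++ {
			ip := net.IPv4(base[0], base[1], base[2], base[3]+byte(i))
			if _, taken := b.used[ip.String()]; !taken {
				b.used[ip.String()] = handleID
				return ip, nil
			}
		}
	}
	return nil, fmt.Errorf("no free addresses for handle %q", handleID)
}

func main() {
	_, cidr, _ := net.ParseCIDR("192.168.100.128/26")
	a := &allocator{blocks: []*block{{cidr: cidr, used: map[string]string{
		// Placeholder claims for addresses already in use on this node.
		"192.168.100.128": "reserved", "192.168.100.129": "h1",
		"192.168.100.130": "h2", "192.168.100.131": "h3", "192.168.100.132": "h4",
	}}}}
	ip, err := a.autoAssign("k8s-pod-network.example")
	fmt.Println(ip, err) // with the placeholders above: 192.168.100.133 <nil>
}
```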
Feb 13 08:25:55.297854 env[1563]: 2024-02-13 08:25:55.233 [INFO][8605] ipam_plugin.go 286: Calico CNI IPAM assigned addresses IPv4=[192.168.100.133/26] IPv6=[] ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" HandleID="k8s-pod-network.ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Workload="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.298363 env[1563]: 2024-02-13 08:25:55.235 [INFO][8582] k8s.go 385: Populated endpoint ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0", GenerateName:"calico-apiserver-bf6ccd78c-", Namespace:"calico-apiserver", SelfLink:"", UID:"8043ab31-8db8-4cf0-bf00-6b4c63970e02", ResourceVersion:"1810", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 25, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bf6ccd78c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"", Pod:"calico-apiserver-bf6ccd78c-n22rv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9bf6edd3f03", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:55.298363 env[1563]: 2024-02-13 08:25:55.235 [INFO][8582] k8s.go 386: Calico CNI using IPs: [192.168.100.133/32] ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.298363 env[1563]: 2024-02-13 08:25:55.235 [INFO][8582] dataplane_linux.go 68: Setting the host side veth name to cali9bf6edd3f03 ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.298363 env[1563]: 2024-02-13 08:25:55.292 [INFO][8582] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.298363 env[1563]: 2024-02-13 08:25:55.292 [INFO][8582] k8s.go 413: Added Mac, interface name, and active container ID to endpoint 
ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0", GenerateName:"calico-apiserver-bf6ccd78c-", Namespace:"calico-apiserver", SelfLink:"", UID:"8043ab31-8db8-4cf0-bf00-6b4c63970e02", ResourceVersion:"1810", Generation:0, CreationTimestamp:time.Date(2024, time.February, 13, 8, 25, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bf6ccd78c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.2-a-56b02fc11a", ContainerID:"ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4", Pod:"calico-apiserver-bf6ccd78c-n22rv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9bf6edd3f03", MAC:"ea:de:c0:10:3e:f4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 08:25:55.298363 env[1563]: 2024-02-13 08:25:55.296 [INFO][8582] k8s.go 491: Wrote updated endpoint to datastore ContainerID="ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4" Namespace="calico-apiserver" Pod="calico-apiserver-bf6ccd78c-n22rv" WorkloadEndpoint="ci--3510.3.2--a--56b02fc11a-k8s-calico--apiserver--bf6ccd78c--n22rv-eth0" Feb 13 08:25:55.303316 env[1563]: time="2024-02-13T08:25:55.303251721Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:25:55.303316 env[1563]: time="2024-02-13T08:25:55.303272918Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:25:55.303316 env[1563]: time="2024-02-13T08:25:55.303279981Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:25:55.303443 env[1563]: time="2024-02-13T08:25:55.303345554Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4 pid=8650 runtime=io.containerd.runc.v2 Feb 13 08:25:55.305000 audit[8663]: NETFILTER_CFG table=filter:131 family=2 entries=61 op=nft_register_chain pid=8663 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:55.332399 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:25:55.332447 kernel: audit: type=1325 audit(1707812755.305:606): table=filter:131 family=2 entries=61 op=nft_register_chain pid=8663 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Feb 13 08:25:55.379249 systemd[1]: Started sshd@40-145.40.67.79:22-139.178.68.195:56202.service. Feb 13 08:25:55.305000 audit[8663]: SYSCALL arch=c000003e syscall=46 success=yes exit=30940 a0=3 a1=7ffd9bc1eb70 a2=0 a3=7ffd9bc1eb5c items=0 ppid=6759 pid=8663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:55.389997 kernel: audit: type=1300 audit(1707812755.305:606): arch=c000003e syscall=46 success=yes exit=30940 a0=3 a1=7ffd9bc1eb70 a2=0 a3=7ffd9bc1eb5c items=0 ppid=6759 pid=8663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:55.305000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:55.505321 sshd[8672]: Accepted publickey for core from 139.178.68.195 port 56202 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:25:55.509292 sshd[8672]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:25:55.511644 systemd-logind[1548]: New session 41 of user core. Feb 13 08:25:55.512338 systemd[1]: Started session-41.scope. Feb 13 08:25:55.545410 kernel: audit: type=1327 audit(1707812755.305:606): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Feb 13 08:25:55.545457 kernel: audit: type=1130 audit(1707812755.378:607): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.67.79:22-139.178.68.195:56202 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:55.378000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.67.79:22-139.178.68.195:56202 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:55.592140 sshd[8672]: pam_unix(sshd:session): session closed for user core Feb 13 08:25:55.593611 systemd[1]: sshd@40-145.40.67.79:22-139.178.68.195:56202.service: Deactivated successfully. Feb 13 08:25:55.594226 systemd-logind[1548]: Session 41 logged out. Waiting for processes to exit. Feb 13 08:25:55.594235 systemd[1]: session-41.scope: Deactivated successfully. Feb 13 08:25:55.594741 systemd-logind[1548]: Removed session 41. 
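The audit records throughout this log carry the invoking command line as a hex-encoded, NUL-separated PROCTITLE field (for example the iptables-nft-restore call right above, or the recurring "sshd: core [priv]" entries). The small standalone decoder below makes them readable; the two sample payloads are copied verbatim from the log.

```go
// Decode the hex PROCTITLE field from the audit records into its argv.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns the audit PROCTITLE hex payload into argv strings.
func decodeProctitle(h string) ([]string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return nil, fmt.Errorf("bad proctitle %q: %w", h, err)
	}
	// Arguments are separated by NUL bytes; trim a trailing NUL if present.
	return strings.Split(strings.TrimRight(string(raw), "\x00"), "\x00"), nil
}

func main() {
	samples := []string{
		// From the NETFILTER_CFG record above (audit 1707812755.305:606).
		"69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030",
		// From the recurring sshd records.
		"737368643A20636F7265205B707269765D",
	}
	for _, s := range samples {
		argv, err := decodeProctitle(s)
		if err != nil {
			fmt.Println(err)
			continue
		}
		fmt.Println(strings.Join(argv, " "))
	}
	// Output:
	// iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000
	// sshd: core [priv]
}
```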
Feb 13 08:25:55.632376 kernel: audit: type=1101 audit(1707812755.503:608): pid=8672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.503000 audit[8672]: USER_ACCT pid=8672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.722920 kernel: audit: type=1103 audit(1707812755.508:609): pid=8672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.508000 audit[8672]: CRED_ACQ pid=8672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.811856 kernel: audit: type=1006 audit(1707812755.508:610): pid=8672 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1 Feb 13 08:25:55.869910 kernel: audit: type=1300 audit(1707812755.508:610): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7d2dab50 a2=3 a3=0 items=0 ppid=1 pid=8672 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:55.508000 audit[8672]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7d2dab50 a2=3 a3=0 items=0 ppid=1 pid=8672 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:25:55.875036 env[1563]: time="2024-02-13T08:25:55.875014747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bf6ccd78c-n22rv,Uid:8043ab31-8db8-4cf0-bf00-6b4c63970e02,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4\"" Feb 13 08:25:55.961226 kernel: audit: type=1327 audit(1707812755.508:610): proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:55.508000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:25:55.991481 kernel: audit: type=1105 audit(1707812755.513:611): pid=8672 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.513000 audit[8672]: USER_START pid=8672 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.513000 audit[8684]: CRED_ACQ pid=8684 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.591000 audit[8672]: USER_END pid=8672 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.591000 audit[8672]: CRED_DISP pid=8672 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:25:55.591000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.67.79:22-139.178.68.195:56202 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:25:56.754360 systemd-networkd[1419]: cali9bf6edd3f03: Gained IPv6LL Feb 13 08:26:00.598736 systemd[1]: Started sshd@41-145.40.67.79:22-139.178.68.195:60544.service. Feb 13 08:26:00.597000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.67.79:22-139.178.68.195:60544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:00.625617 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:26:00.625667 kernel: audit: type=1130 audit(1707812760.597:616): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.67.79:22-139.178.68.195:60544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:00.734000 audit[8717]: USER_ACCT pid=8717 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.735232 sshd[8717]: Accepted publickey for core from 139.178.68.195 port 60544 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:00.736653 sshd[8717]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:00.739402 systemd-logind[1548]: New session 42 of user core. Feb 13 08:26:00.739817 systemd[1]: Started session-42.scope. Feb 13 08:26:00.816273 sshd[8717]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:00.817722 systemd[1]: sshd@41-145.40.67.79:22-139.178.68.195:60544.service: Deactivated successfully. Feb 13 08:26:00.818376 systemd[1]: session-42.scope: Deactivated successfully. Feb 13 08:26:00.818418 systemd-logind[1548]: Session 42 logged out. Waiting for processes to exit. Feb 13 08:26:00.818914 systemd-logind[1548]: Removed session 42. 
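Each SSH connection in this log is handled by a per-connection systemd unit such as sshd@41-145.40.67.79:22-139.178.68.195:60544.service, whose instance name appears to pack a connection counter, the local listening endpoint, and the client endpoint. The throwaway parser below splits such a name back into those pieces; it is plain string handling for the IPv4-style names seen here, not a systemd API, and the unit names in main are taken from the log.

```go
// Split a per-connection sshd unit name into its endpoints.
package main

import (
	"fmt"
	"strings"
)

type sshConn struct {
	instance string // connection counter, e.g. "41"
	local    string // listening endpoint, e.g. "145.40.67.79:22"
	peer     string // client endpoint, e.g. "139.178.68.195:60544"
}

func parseUnit(unit string) (sshConn, error) {
	name := strings.TrimSuffix(unit, ".service")
	_, inst, ok := strings.Cut(name, "@")
	if !ok {
		return sshConn{}, fmt.Errorf("not a templated unit: %q", unit)
	}
	// IPv4 endpoints contain no hyphens, so three fields fall out directly.
	parts := strings.SplitN(inst, "-", 3)
	if len(parts) != 3 {
		return sshConn{}, fmt.Errorf("unexpected instance %q", inst)
	}
	return sshConn{instance: parts[0], local: parts[1], peer: parts[2]}, nil
}

func main() {
	for _, u := range []string{
		"sshd@39-145.40.67.79:22-139.178.68.195:56186.service",
		"sshd@41-145.40.67.79:22-139.178.68.195:60544.service",
	} {
		c, err := parseUnit(u)
		fmt.Printf("%+v err=%v\n", c, err)
	}
}
```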
Feb 13 08:26:00.735000 audit[8717]: CRED_ACQ pid=8717 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.914642 kernel: audit: type=1101 audit(1707812760.734:617): pid=8717 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.914674 kernel: audit: type=1103 audit(1707812760.735:618): pid=8717 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.914687 kernel: audit: type=1006 audit(1707812760.735:619): pid=8717 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1 Feb 13 08:26:00.972401 kernel: audit: type=1300 audit(1707812760.735:619): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3725ecb0 a2=3 a3=0 items=0 ppid=1 pid=8717 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:00.735000 audit[8717]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3725ecb0 a2=3 a3=0 items=0 ppid=1 pid=8717 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:00.735000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:01.093307 kernel: audit: type=1327 audit(1707812760.735:619): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:00.741000 audit[8717]: USER_START pid=8717 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:01.186760 kernel: audit: type=1105 audit(1707812760.741:620): pid=8717 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:01.186786 kernel: audit: type=1103 audit(1707812760.741:621): pid=8720 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.741000 audit[8720]: CRED_ACQ pid=8720 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:01.275419 kernel: audit: type=1106 audit(1707812760.815:622): pid=8717 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.815000 audit[8717]: USER_END pid=8717 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:01.371209 kernel: audit: type=1104 audit(1707812760.816:623): pid=8717 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.816000 audit[8717]: CRED_DISP pid=8717 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:00.816000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.67.79:22-139.178.68.195:60544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:05.824971 systemd[1]: Started sshd@42-145.40.67.79:22-139.178.68.195:60554.service. Feb 13 08:26:05.824000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.67.79:22-139.178.68.195:60554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:05.861745 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:05.861827 kernel: audit: type=1130 audit(1707812765.824:625): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.67.79:22-139.178.68.195:60554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:05.969000 audit[8772]: USER_ACCT pid=8772 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:05.970293 sshd[8772]: Accepted publickey for core from 139.178.68.195 port 60554 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:05.971280 sshd[8772]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:05.973646 systemd-logind[1548]: New session 43 of user core. Feb 13 08:26:05.974177 systemd[1]: Started session-43.scope. Feb 13 08:26:06.053378 sshd[8772]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:06.054759 systemd[1]: sshd@42-145.40.67.79:22-139.178.68.195:60554.service: Deactivated successfully. Feb 13 08:26:06.055414 systemd[1]: session-43.scope: Deactivated successfully. Feb 13 08:26:06.055456 systemd-logind[1548]: Session 43 logged out. Waiting for processes to exit. Feb 13 08:26:06.055965 systemd-logind[1548]: Removed session 43. 
Feb 13 08:26:05.970000 audit[8772]: CRED_ACQ pid=8772 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.151640 kernel: audit: type=1101 audit(1707812765.969:626): pid=8772 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.151671 kernel: audit: type=1103 audit(1707812765.970:627): pid=8772 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.151684 kernel: audit: type=1006 audit(1707812765.970:628): pid=8772 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1 Feb 13 08:26:06.209814 kernel: audit: type=1300 audit(1707812765.970:628): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed1c37a10 a2=3 a3=0 items=0 ppid=1 pid=8772 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:05.970000 audit[8772]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed1c37a10 a2=3 a3=0 items=0 ppid=1 pid=8772 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:06.301323 kernel: audit: type=1327 audit(1707812765.970:628): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:05.970000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:06.331714 kernel: audit: type=1105 audit(1707812765.975:629): pid=8772 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:05.975000 audit[8772]: USER_START pid=8772 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.426528 kernel: audit: type=1103 audit(1707812765.976:630): pid=8775 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:05.976000 audit[8775]: CRED_ACQ pid=8775 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.515604 kernel: audit: type=1106 audit(1707812766.053:631): pid=8772 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.053000 audit[8772]: USER_END pid=8772 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.610916 kernel: audit: type=1104 audit(1707812766.053:632): pid=8772 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.053000 audit[8772]: CRED_DISP pid=8772 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:06.053000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.67.79:22-139.178.68.195:60554 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:11.061002 systemd[1]: Started sshd@43-145.40.67.79:22-139.178.68.195:57514.service. Feb 13 08:26:11.060000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.67.79:22-139.178.68.195:57514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:11.087811 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:11.087864 kernel: audit: type=1130 audit(1707812771.060:634): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.67.79:22-139.178.68.195:57514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:11.197000 audit[8798]: USER_ACCT pid=8798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.198154 sshd[8798]: Accepted publickey for core from 139.178.68.195 port 57514 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:11.199324 sshd[8798]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:11.201730 systemd-logind[1548]: New session 44 of user core. Feb 13 08:26:11.202464 systemd[1]: Started session-44.scope. Feb 13 08:26:11.279874 sshd[8798]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:11.281210 systemd[1]: sshd@43-145.40.67.79:22-139.178.68.195:57514.service: Deactivated successfully. Feb 13 08:26:11.281814 systemd-logind[1548]: Session 44 logged out. Waiting for processes to exit. Feb 13 08:26:11.281831 systemd[1]: session-44.scope: Deactivated successfully. Feb 13 08:26:11.282387 systemd-logind[1548]: Removed session 44. 
Feb 13 08:26:11.198000 audit[8798]: CRED_ACQ pid=8798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.379541 kernel: audit: type=1101 audit(1707812771.197:635): pid=8798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.379572 kernel: audit: type=1103 audit(1707812771.198:636): pid=8798 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.379590 kernel: audit: type=1006 audit(1707812771.198:637): pid=8798 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1 Feb 13 08:26:11.437979 kernel: audit: type=1300 audit(1707812771.198:637): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff335dcf10 a2=3 a3=0 items=0 ppid=1 pid=8798 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:11.198000 audit[8798]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff335dcf10 a2=3 a3=0 items=0 ppid=1 pid=8798 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:11.529791 kernel: audit: type=1327 audit(1707812771.198:637): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:11.198000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:11.560205 kernel: audit: type=1105 audit(1707812771.203:638): pid=8798 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.203000 audit[8798]: USER_START pid=8798 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.654516 kernel: audit: type=1103 audit(1707812771.204:639): pid=8801 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.204000 audit[8801]: CRED_ACQ pid=8801 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.279000 audit[8798]: USER_END pid=8798 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.838932 kernel: audit: type=1106 audit(1707812771.279:640): pid=8798 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.838967 kernel: audit: type=1104 audit(1707812771.279:641): pid=8798 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.279000 audit[8798]: CRED_DISP pid=8798 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:11.280000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.67.79:22-139.178.68.195:57514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:12.390027 env[1563]: time="2024-02-13T08:26:12.389973297Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:12.391013 env[1563]: time="2024-02-13T08:26:12.390987591Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:d36ef67f7b24c4facd86d0bc06b0cd907431a822dee695eb06b86a905bff85d4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:12.392174 env[1563]: time="2024-02-13T08:26:12.392162284Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:12.393197 env[1563]: time="2024-02-13T08:26:12.393148450Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar@sha256:45a7aba6020a7cf7b866cb8a8d481b30c97e9b3407e1459aaa65a5b4cc06633a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:12.393428 env[1563]: time="2024-02-13T08:26:12.393386867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.27.0\" returns image reference \"sha256:d36ef67f7b24c4facd86d0bc06b0cd907431a822dee695eb06b86a905bff85d4\"" Feb 13 08:26:12.393792 env[1563]: time="2024-02-13T08:26:12.393748733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.27.0\"" Feb 13 08:26:12.394626 env[1563]: time="2024-02-13T08:26:12.394589569Z" level=info msg="CreateContainer within sandbox \"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 08:26:12.399363 env[1563]: time="2024-02-13T08:26:12.399346286Z" level=info msg="CreateContainer within sandbox \"e09c977d0a21b0e5a4567c9192cf47e1a2da6d7606d607cf20d10be754fa85d4\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"0a52fc9b6bb844bb68e5e3b395a97135a17d41fefecee87aa104f190a336043d\"" Feb 13 08:26:12.399642 env[1563]: time="2024-02-13T08:26:12.399583914Z" level=info msg="StartContainer for 
\"0a52fc9b6bb844bb68e5e3b395a97135a17d41fefecee87aa104f190a336043d\"" Feb 13 08:26:12.423388 env[1563]: time="2024-02-13T08:26:12.423363976Z" level=info msg="StartContainer for \"0a52fc9b6bb844bb68e5e3b395a97135a17d41fefecee87aa104f190a336043d\" returns successfully" Feb 13 08:26:12.983123 kubelet[2738]: I0213 08:26:12.983037 2738 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 08:26:12.983123 kubelet[2738]: I0213 08:26:12.983119 2738 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 08:26:13.001854 kubelet[2738]: I0213 08:26:13.001835 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-mrqvb" podStartSLOduration=-9.22337154285297e+09 pod.CreationTimestamp="2024-02-13 08:17:59 +0000 UTC" firstStartedPulling="2024-02-13 08:25:19.359901534 +0000 UTC m=+458.886810599" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:26:13.001681311 +0000 UTC m=+512.528590378" watchObservedRunningTime="2024-02-13 08:26:13.001805676 +0000 UTC m=+512.528714742" Feb 13 08:26:16.289546 systemd[1]: Started sshd@44-145.40.67.79:22-139.178.68.195:55736.service. Feb 13 08:26:16.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.67.79:22-139.178.68.195:55736 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:16.317439 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:16.317486 kernel: audit: type=1130 audit(1707812776.289:643): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.67.79:22-139.178.68.195:55736 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:16.427000 audit[8865]: USER_ACCT pid=8865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.428165 sshd[8865]: Accepted publickey for core from 139.178.68.195 port 55736 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:16.429241 sshd[8865]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:16.431660 systemd-logind[1548]: New session 45 of user core. Feb 13 08:26:16.432314 systemd[1]: Started session-45.scope. Feb 13 08:26:16.509890 sshd[8865]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:16.511400 systemd[1]: sshd@44-145.40.67.79:22-139.178.68.195:55736.service: Deactivated successfully. Feb 13 08:26:16.511937 systemd-logind[1548]: Session 45 logged out. Waiting for processes to exit. Feb 13 08:26:16.511973 systemd[1]: session-45.scope: Deactivated successfully. Feb 13 08:26:16.512499 systemd-logind[1548]: Removed session 45. 
Feb 13 08:26:16.520058 kernel: audit: type=1101 audit(1707812776.427:644): pid=8865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.520095 kernel: audit: type=1103 audit(1707812776.428:645): pid=8865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.428000 audit[8865]: CRED_ACQ pid=8865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.668890 kernel: audit: type=1006 audit(1707812776.428:646): pid=8865 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=45 res=1 Feb 13 08:26:16.668930 kernel: audit: type=1300 audit(1707812776.428:646): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed80f0510 a2=3 a3=0 items=0 ppid=1 pid=8865 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:16.428000 audit[8865]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffed80f0510 a2=3 a3=0 items=0 ppid=1 pid=8865 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:16.760747 kernel: audit: type=1327 audit(1707812776.428:646): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:16.428000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:16.791141 kernel: audit: type=1105 audit(1707812776.433:647): pid=8865 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.433000 audit[8865]: USER_START pid=8865 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.434000 audit[8868]: CRED_ACQ pid=8868 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.974541 kernel: audit: type=1103 audit(1707812776.434:648): pid=8868 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.974575 kernel: audit: type=1106 audit(1707812776.509:649): pid=8865 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.509000 audit[8865]: USER_END pid=8865 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:17.070081 kernel: audit: type=1104 audit(1707812776.509:650): pid=8865 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.509000 audit[8865]: CRED_DISP pid=8865 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:16.510000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.67.79:22-139.178.68.195:55736 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:21.516812 systemd[1]: Started sshd@45-145.40.67.79:22-139.178.68.195:55738.service. Feb 13 08:26:21.515000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.67.79:22-139.178.68.195:55738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:21.543753 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:21.543795 kernel: audit: type=1130 audit(1707812781.515:652): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.67.79:22-139.178.68.195:55738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:21.652000 audit[8892]: USER_ACCT pid=8892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.653859 sshd[8892]: Accepted publickey for core from 139.178.68.195 port 55738 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:21.655307 sshd[8892]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:21.657721 systemd-logind[1548]: New session 46 of user core. Feb 13 08:26:21.658172 systemd[1]: Started session-46.scope. Feb 13 08:26:21.736401 sshd[8892]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:21.737662 systemd[1]: sshd@45-145.40.67.79:22-139.178.68.195:55738.service: Deactivated successfully. Feb 13 08:26:21.738258 systemd-logind[1548]: Session 46 logged out. Waiting for processes to exit. Feb 13 08:26:21.738298 systemd[1]: session-46.scope: Deactivated successfully. Feb 13 08:26:21.738813 systemd-logind[1548]: Removed session 46. 
Feb 13 08:26:21.654000 audit[8892]: CRED_ACQ pid=8892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.835861 kernel: audit: type=1101 audit(1707812781.652:653): pid=8892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.835902 kernel: audit: type=1103 audit(1707812781.654:654): pid=8892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.835915 kernel: audit: type=1006 audit(1707812781.654:655): pid=8892 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=46 res=1 Feb 13 08:26:21.894296 kernel: audit: type=1300 audit(1707812781.654:655): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd31e87a0 a2=3 a3=0 items=0 ppid=1 pid=8892 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:21.654000 audit[8892]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd31e87a0 a2=3 a3=0 items=0 ppid=1 pid=8892 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:21.654000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:22.016465 kernel: audit: type=1327 audit(1707812781.654:655): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:22.016509 kernel: audit: type=1105 audit(1707812781.659:656): pid=8892 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.659000 audit[8892]: USER_START pid=8892 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.660000 audit[8895]: CRED_ACQ pid=8895 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:22.199960 kernel: audit: type=1103 audit(1707812781.660:657): pid=8895 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:22.200038 kernel: audit: type=1106 audit(1707812781.735:658): pid=8892 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.735000 audit[8892]: USER_END pid=8892 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:22.295275 kernel: audit: type=1104 audit(1707812781.736:659): pid=8892 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:21.736000 audit[8892]: CRED_DISP pid=8892 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:22.370562 env[1563]: time="2024-02-13T08:26:22.370507470Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:22.371346 env[1563]: time="2024-02-13T08:26:22.371286372Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:848c5b919e8d33dbad8c8c64aa6aec07c29cfe6e4f6312ceafc1641ea929f91a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:22.372126 env[1563]: time="2024-02-13T08:26:22.372076516Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:22.373865 env[1563]: time="2024-02-13T08:26:22.373826117Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver@sha256:5ff0bdc8d0b2e9d7819703b18867f60f9153ed01da81e2bbfa22002abec9dc26,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:26:22.374977 env[1563]: time="2024-02-13T08:26:22.374910568Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.27.0\" returns image reference \"sha256:848c5b919e8d33dbad8c8c64aa6aec07c29cfe6e4f6312ceafc1641ea929f91a\"" Feb 13 08:26:22.375890 env[1563]: time="2024-02-13T08:26:22.375874647Z" level=info msg="CreateContainer within sandbox \"ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 08:26:21.736000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.67.79:22-139.178.68.195:55738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:26:22.385770 env[1563]: time="2024-02-13T08:26:22.385726920Z" level=info msg="CreateContainer within sandbox \"ff4f32cafe9b88b1afb1bf7d20bdb6f4f80050c40e3e0ea551b721b8caa60ed4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1baeae76567ca07f1bb4b17025ba024d7d01791a7679c72b338b8dfa5bb64271\"" Feb 13 08:26:22.385932 env[1563]: time="2024-02-13T08:26:22.385919963Z" level=info msg="StartContainer for \"1baeae76567ca07f1bb4b17025ba024d7d01791a7679c72b338b8dfa5bb64271\"" Feb 13 08:26:22.417664 env[1563]: time="2024-02-13T08:26:22.417636957Z" level=info msg="StartContainer for \"1baeae76567ca07f1bb4b17025ba024d7d01791a7679c72b338b8dfa5bb64271\" returns successfully" Feb 13 08:26:23.036667 kubelet[2738]: I0213 08:26:23.036608 2738 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-bf6ccd78c-n22rv" podStartSLOduration=-9.22337200781829e+09 pod.CreationTimestamp="2024-02-13 08:25:54 +0000 UTC" firstStartedPulling="2024-02-13 08:25:55.875502826 +0000 UTC m=+495.402411893" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:26:23.035522667 +0000 UTC m=+522.562431796" watchObservedRunningTime="2024-02-13 08:26:23.036484263 +0000 UTC m=+522.563393377" Feb 13 08:26:23.126000 audit[8992]: NETFILTER_CFG table=filter:132 family=2 entries=8 op=nft_register_rule pid=8992 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:26:23.126000 audit[8992]: SYSCALL arch=c000003e syscall=46 success=yes exit=2620 a0=3 a1=7fff4e100f40 a2=0 a3=7fff4e100f2c items=0 ppid=3000 pid=8992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:23.126000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:26:23.130000 audit[8992]: NETFILTER_CFG table=nat:133 family=2 entries=78 op=nft_register_rule pid=8992 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:26:23.130000 audit[8992]: SYSCALL arch=c000003e syscall=46 success=yes exit=24988 a0=3 a1=7fff4e100f40 a2=0 a3=7fff4e100f2c items=0 ppid=3000 pid=8992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:23.130000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:26:26.743970 systemd[1]: Started sshd@46-145.40.67.79:22-139.178.68.195:33318.service. Feb 13 08:26:26.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.67.79:22-139.178.68.195:33318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:26.770805 kernel: kauditd_printk_skb: 7 callbacks suppressed Feb 13 08:26:26.770833 kernel: audit: type=1130 audit(1707812786.743:663): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.67.79:22-139.178.68.195:33318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:26:26.879000 audit[9032]: USER_ACCT pid=9032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:26.880558 sshd[9032]: Accepted publickey for core from 139.178.68.195 port 33318 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:26.882319 sshd[9032]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:26.885109 systemd-logind[1548]: New session 47 of user core. Feb 13 08:26:26.885671 systemd[1]: Started session-47.scope. Feb 13 08:26:26.881000 audit[9032]: CRED_ACQ pid=9032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.062077 kernel: audit: type=1101 audit(1707812786.879:664): pid=9032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.062110 kernel: audit: type=1103 audit(1707812786.881:665): pid=9032 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.062129 kernel: audit: type=1006 audit(1707812786.881:666): pid=9032 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=47 res=1 Feb 13 08:26:27.098310 sshd[9032]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:27.099616 systemd[1]: sshd@46-145.40.67.79:22-139.178.68.195:33318.service: Deactivated successfully. Feb 13 08:26:27.100226 systemd-logind[1548]: Session 47 logged out. Waiting for processes to exit. Feb 13 08:26:27.100237 systemd[1]: session-47.scope: Deactivated successfully. Feb 13 08:26:27.100712 systemd-logind[1548]: Removed session 47. 
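[editor's note] The NETFILTER_CFG records a little further up (table=filter:132 and table=nat:133, comm="iptables-restor" -- comm is truncated to 15 characters by the kernel) use the same flat key=value layout as the rest of the audit trail, so they can be picked apart with a small parser. A minimal sketch; the regular expression and helper name are our own, not taken from any tool in this log:

    import re

    # Minimal sketch: split one audit record into its key=value fields.
    # Values may be bare tokens, 'single-quoted', or "double-quoted".
    FIELD_RE = re.compile(r"(\w[\w-]*)=('[^']*'|\"[^\"]*\"|\S+)")

    def parse_audit_fields(record: str) -> dict:
        return {k: v.strip("'\"") for k, v in FIELD_RE.findall(record)}

    line = ('audit[8992]: NETFILTER_CFG table=nat:133 family=2 entries=78 '
            'op=nft_register_rule pid=8992 comm="iptables-restor"')
    fields = parse_audit_fields(line)
    print(fields["table"], fields["entries"], fields["comm"])
    # -> nat:133 78 iptables-restor

Decoding the PROCTITLE hex attached to those netfilter records with the helper shown earlier gives "iptables-restore -w 5 -W 100000 --noflush --counters", i.e. the full command line behind the truncated comm value.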
Feb 13 08:26:26.881000 audit[9032]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe9052d2d0 a2=3 a3=0 items=0 ppid=1 pid=9032 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:27.212268 kernel: audit: type=1300 audit(1707812786.881:666): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe9052d2d0 a2=3 a3=0 items=0 ppid=1 pid=9032 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:27.212310 kernel: audit: type=1327 audit(1707812786.881:666): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:26.881000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:27.242634 kernel: audit: type=1105 audit(1707812786.887:667): pid=9032 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:26.887000 audit[9032]: USER_START pid=9032 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.336892 kernel: audit: type=1103 audit(1707812786.888:668): pid=9035 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:26.888000 audit[9035]: CRED_ACQ pid=9035 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.097000 audit[9032]: USER_END pid=9032 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.521214 kernel: audit: type=1106 audit(1707812787.097:669): pid=9032 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.521247 kernel: audit: type=1104 audit(1707812787.097:670): pid=9032 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.097000 audit[9032]: CRED_DISP pid=9032 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:27.098000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=sshd@46-145.40.67.79:22-139.178.68.195:33318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:32.104665 systemd[1]: Started sshd@47-145.40.67.79:22-139.178.68.195:33328.service. Feb 13 08:26:32.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.67.79:22-139.178.68.195:33328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:32.131444 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:32.131498 kernel: audit: type=1130 audit(1707812792.103:672): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.67.79:22-139.178.68.195:33328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:32.240000 audit[9059]: USER_ACCT pid=9059 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.241397 sshd[9059]: Accepted publickey for core from 139.178.68.195 port 33328 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:32.243289 sshd[9059]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:32.245962 systemd-logind[1548]: New session 48 of user core. Feb 13 08:26:32.246496 systemd[1]: Started session-48.scope. Feb 13 08:26:32.332717 sshd[9059]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:32.242000 audit[9059]: CRED_ACQ pid=9059 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.334117 systemd[1]: sshd@47-145.40.67.79:22-139.178.68.195:33328.service: Deactivated successfully. Feb 13 08:26:32.334702 systemd-logind[1548]: Session 48 logged out. Waiting for processes to exit. Feb 13 08:26:32.334731 systemd[1]: session-48.scope: Deactivated successfully. Feb 13 08:26:32.335343 systemd-logind[1548]: Removed session 48. 
Feb 13 08:26:32.424088 kernel: audit: type=1101 audit(1707812792.240:673): pid=9059 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.424120 kernel: audit: type=1103 audit(1707812792.242:674): pid=9059 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.424134 kernel: audit: type=1006 audit(1707812792.242:675): pid=9059 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1 Feb 13 08:26:32.482551 kernel: audit: type=1300 audit(1707812792.242:675): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe19b03d00 a2=3 a3=0 items=0 ppid=1 pid=9059 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:32.242000 audit[9059]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe19b03d00 a2=3 a3=0 items=0 ppid=1 pid=9059 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:32.574399 kernel: audit: type=1327 audit(1707812792.242:675): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:32.242000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:32.604789 kernel: audit: type=1105 audit(1707812792.247:676): pid=9059 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.247000 audit[9059]: USER_START pid=9059 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.248000 audit[9062]: CRED_ACQ pid=9062 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.788140 kernel: audit: type=1103 audit(1707812792.248:677): pid=9062 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.788183 kernel: audit: type=1106 audit(1707812792.332:678): pid=9059 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.332000 audit[9059]: USER_END pid=9059 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.883519 kernel: audit: type=1104 audit(1707812792.332:679): pid=9059 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.332000 audit[9059]: CRED_DISP pid=9059 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:32.333000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.67.79:22-139.178.68.195:33328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:37.339505 systemd[1]: Started sshd@48-145.40.67.79:22-139.178.68.195:52982.service. Feb 13 08:26:37.338000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.67.79:22-139.178.68.195:52982 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:37.366499 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:37.366544 kernel: audit: type=1130 audit(1707812797.338:681): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.67.79:22-139.178.68.195:52982 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:37.475000 audit[9118]: USER_ACCT pid=9118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.477195 sshd[9118]: Accepted publickey for core from 139.178.68.195 port 52982 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:37.480737 sshd[9118]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:37.484709 systemd-logind[1548]: New session 49 of user core. Feb 13 08:26:37.485273 systemd[1]: Started session-49.scope. Feb 13 08:26:37.565574 sshd[9118]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:37.566869 systemd[1]: sshd@48-145.40.67.79:22-139.178.68.195:52982.service: Deactivated successfully. Feb 13 08:26:37.567555 systemd[1]: session-49.scope: Deactivated successfully. Feb 13 08:26:37.567584 systemd-logind[1548]: Session 49 logged out. Waiting for processes to exit. Feb 13 08:26:37.479000 audit[9118]: CRED_ACQ pid=9118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.568406 systemd-logind[1548]: Removed session 49. 
Feb 13 08:26:37.658802 kernel: audit: type=1101 audit(1707812797.475:682): pid=9118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.658835 kernel: audit: type=1103 audit(1707812797.479:683): pid=9118 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.658847 kernel: audit: type=1006 audit(1707812797.479:684): pid=9118 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1 Feb 13 08:26:37.717256 kernel: audit: type=1300 audit(1707812797.479:684): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff101b4350 a2=3 a3=0 items=0 ppid=1 pid=9118 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:37.479000 audit[9118]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff101b4350 a2=3 a3=0 items=0 ppid=1 pid=9118 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:37.809128 kernel: audit: type=1327 audit(1707812797.479:684): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:37.479000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:37.839542 kernel: audit: type=1105 audit(1707812797.486:685): pid=9118 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.486000 audit[9118]: USER_START pid=9118 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.933887 kernel: audit: type=1103 audit(1707812797.487:686): pid=9121 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.487000 audit[9121]: CRED_ACQ pid=9121 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:38.022981 kernel: audit: type=1106 audit(1707812797.565:687): pid=9118 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.565000 audit[9118]: USER_END pid=9118 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:38.118439 kernel: audit: type=1104 audit(1707812797.565:688): pid=9118 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.565000 audit[9118]: CRED_DISP pid=9118 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:37.566000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.67.79:22-139.178.68.195:52982 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:42.572740 systemd[1]: Started sshd@49-145.40.67.79:22-139.178.68.195:52994.service. Feb 13 08:26:42.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.67.79:22-139.178.68.195:52994 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:42.600002 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:42.600054 kernel: audit: type=1130 audit(1707812802.571:690): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.67.79:22-139.178.68.195:52994 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:42.710000 audit[9148]: USER_ACCT pid=9148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.712009 sshd[9148]: Accepted publickey for core from 139.178.68.195 port 52994 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:42.714975 sshd[9148]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:42.719706 systemd-logind[1548]: New session 50 of user core. Feb 13 08:26:42.720211 systemd[1]: Started session-50.scope. Feb 13 08:26:42.798213 sshd[9148]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:42.799628 systemd[1]: sshd@49-145.40.67.79:22-139.178.68.195:52994.service: Deactivated successfully. Feb 13 08:26:42.800245 systemd-logind[1548]: Session 50 logged out. Waiting for processes to exit. Feb 13 08:26:42.800271 systemd[1]: session-50.scope: Deactivated successfully. Feb 13 08:26:42.800946 systemd-logind[1548]: Removed session 50. 
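[editor's note] Every SSH connection in this stretch of the log (sessions 42 through 51) follows the same audit lifecycle: a type=1105 record (PAM session_open / USER_START) when the session begins and a type=1106 record (PAM session_close / USER_END) when it ends, both carrying the same ses= value and an epoch timestamp in the audit(...) prefix. Pairing them gives the session length. A minimal sketch with our own helper name, using abbreviated demo lines lifted from session 43 above:

    import re

    # Minimal sketch: pair type=1105 (session open) and type=1106 (session
    # close) audit records by their ses= id and report how long each session
    # stayed open, using the epoch seconds inside the audit(...) prefix.
    TS_RE = re.compile(r"audit\((\d+\.\d+):\d+\)")
    SES_RE = re.compile(r"\bses=(\d+)\b")

    def session_durations(lines):
        opened = {}
        for line in lines:
            ts_m, ses_m = TS_RE.search(line), SES_RE.search(line)
            if not (ts_m and ses_m):
                continue
            ts, ses = float(ts_m.group(1)), ses_m.group(1)
            if "type=1105" in line:
                opened[ses] = ts
            elif "type=1106" in line and ses in opened:
                yield ses, ts - opened.pop(ses)

    demo = [
        "audit: type=1105 audit(1707812765.975:629): pid=8772 auid=500 ses=43 msg='op=PAM:session_open ...'",
        "audit: type=1106 audit(1707812766.053:631): pid=8772 auid=500 ses=43 msg='op=PAM:session_close ...'",
    ]
    for ses, seconds in session_durations(demo):
        print(f"session {ses}: open for {seconds:.3f}s")  # -> session 43: open for 0.078s

The sub-second durations this reports for the sessions above are consistent with the pattern visible in the log: each connection from 139.178.68.195 opens a session, runs briefly, and is closed and deactivated almost immediately.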
Feb 13 08:26:42.713000 audit[9148]: CRED_ACQ pid=9148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.893427 kernel: audit: type=1101 audit(1707812802.710:691): pid=9148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.893463 kernel: audit: type=1103 audit(1707812802.713:692): pid=9148 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.893482 kernel: audit: type=1006 audit(1707812802.713:693): pid=9148 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1 Feb 13 08:26:42.951904 kernel: audit: type=1300 audit(1707812802.713:693): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4466c540 a2=3 a3=0 items=0 ppid=1 pid=9148 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:42.713000 audit[9148]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4466c540 a2=3 a3=0 items=0 ppid=1 pid=9148 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:43.043828 kernel: audit: type=1327 audit(1707812802.713:693): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:42.713000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:43.074281 kernel: audit: type=1105 audit(1707812802.721:694): pid=9148 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.721000 audit[9148]: USER_START pid=9148 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:43.168658 kernel: audit: type=1103 audit(1707812802.722:695): pid=9151 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.722000 audit[9151]: CRED_ACQ pid=9151 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:43.257856 kernel: audit: type=1106 audit(1707812802.797:696): pid=9148 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.797000 audit[9148]: USER_END pid=9148 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:43.353346 kernel: audit: type=1104 audit(1707812802.797:697): pid=9148 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.797000 audit[9148]: CRED_DISP pid=9148 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:42.798000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.67.79:22-139.178.68.195:52994 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:47.805506 systemd[1]: Started sshd@50-145.40.67.79:22-139.178.68.195:49380.service. Feb 13 08:26:47.804000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.67.79:22-139.178.68.195:49380 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:47.832767 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:47.832861 kernel: audit: type=1130 audit(1707812807.804:699): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.67.79:22-139.178.68.195:49380 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:47.942000 audit[9174]: USER_ACCT pid=9174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:47.943457 sshd[9174]: Accepted publickey for core from 139.178.68.195 port 49380 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:47.944271 sshd[9174]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:47.946659 systemd-logind[1548]: New session 51 of user core. Feb 13 08:26:47.947079 systemd[1]: Started session-51.scope. Feb 13 08:26:48.025762 sshd[9174]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:48.027179 systemd[1]: sshd@50-145.40.67.79:22-139.178.68.195:49380.service: Deactivated successfully. Feb 13 08:26:48.027782 systemd-logind[1548]: Session 51 logged out. Waiting for processes to exit. Feb 13 08:26:48.027793 systemd[1]: session-51.scope: Deactivated successfully. Feb 13 08:26:48.028358 systemd-logind[1548]: Removed session 51. 
Feb 13 08:26:47.943000 audit[9174]: CRED_ACQ pid=9174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.035006 kernel: audit: type=1101 audit(1707812807.942:700): pid=9174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.035057 kernel: audit: type=1103 audit(1707812807.943:701): pid=9174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.183418 kernel: audit: type=1006 audit(1707812807.943:702): pid=9174 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1 Feb 13 08:26:48.183493 kernel: audit: type=1300 audit(1707812807.943:702): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf8719d30 a2=3 a3=0 items=0 ppid=1 pid=9174 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:47.943000 audit[9174]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf8719d30 a2=3 a3=0 items=0 ppid=1 pid=9174 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:47.943000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:48.305859 kernel: audit: type=1327 audit(1707812807.943:702): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:47.948000 audit[9174]: USER_START pid=9174 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.400203 kernel: audit: type=1105 audit(1707812807.948:703): pid=9174 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:47.948000 audit[9177]: CRED_ACQ pid=9177 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.489278 kernel: audit: type=1103 audit(1707812807.948:704): pid=9177 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.489318 kernel: audit: type=1106 audit(1707812808.025:705): pid=9174 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.025000 audit[9174]: USER_END pid=9174 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.025000 audit[9174]: CRED_DISP pid=9174 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.673884 kernel: audit: type=1104 audit(1707812808.025:706): pid=9174 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:48.026000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.67.79:22-139.178.68.195:49380 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:53.032281 systemd[1]: Started sshd@51-145.40.67.79:22-139.178.68.195:49386.service. Feb 13 08:26:53.031000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.67.79:22-139.178.68.195:49386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:53.059339 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:26:53.059372 kernel: audit: type=1130 audit(1707812813.031:708): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.67.79:22-139.178.68.195:49386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:53.167000 audit[9215]: USER_ACCT pid=9215 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.168733 sshd[9215]: Accepted publickey for core from 139.178.68.195 port 49386 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:53.170286 sshd[9215]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:53.172339 systemd-logind[1548]: New session 52 of user core. Feb 13 08:26:53.172799 systemd[1]: Started session-52.scope. Feb 13 08:26:53.169000 audit[9215]: CRED_ACQ pid=9215 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.262593 sshd[9215]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:53.264033 systemd[1]: sshd@51-145.40.67.79:22-139.178.68.195:49386.service: Deactivated successfully. Feb 13 08:26:53.264846 systemd-logind[1548]: Session 52 logged out. Waiting for processes to exit. Feb 13 08:26:53.264908 systemd[1]: session-52.scope: Deactivated successfully. Feb 13 08:26:53.265454 systemd-logind[1548]: Removed session 52. 
Feb 13 08:26:53.350403 kernel: audit: type=1101 audit(1707812813.167:709): pid=9215 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.350443 kernel: audit: type=1103 audit(1707812813.169:710): pid=9215 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.350459 kernel: audit: type=1006 audit(1707812813.169:711): pid=9215 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Feb 13 08:26:53.408885 kernel: audit: type=1300 audit(1707812813.169:711): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3b624a10 a2=3 a3=0 items=0 ppid=1 pid=9215 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:53.169000 audit[9215]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3b624a10 a2=3 a3=0 items=0 ppid=1 pid=9215 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:53.500821 kernel: audit: type=1327 audit(1707812813.169:711): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:53.169000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:53.531248 kernel: audit: type=1105 audit(1707812813.173:712): pid=9215 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.173000 audit[9215]: USER_START pid=9215 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.625625 kernel: audit: type=1103 audit(1707812813.174:713): pid=9218 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.174000 audit[9218]: CRED_ACQ pid=9218 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.714709 kernel: audit: type=1106 audit(1707812813.262:714): pid=9215 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.262000 audit[9215]: USER_END pid=9215 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.810128 kernel: audit: type=1104 audit(1707812813.262:715): pid=9215 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.262000 audit[9215]: CRED_DISP pid=9215 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:53.871146 systemd[1]: Started sshd@52-145.40.67.79:22-160.251.212.122:37812.service. Feb 13 08:26:53.875718 sshd[9240]: kex_exchange_identification: Connection closed by remote host Feb 13 08:26:53.875718 sshd[9240]: Connection closed by 160.251.212.122 port 37812 Feb 13 08:26:53.876225 systemd[1]: sshd@52-145.40.67.79:22-160.251.212.122:37812.service: Deactivated successfully. Feb 13 08:26:53.263000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.67.79:22-139.178.68.195:49386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:53.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.67.79:22-160.251.212.122:37812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:53.875000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.67.79:22-160.251.212.122:37812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:53.995606 systemd[1]: Started sshd@53-145.40.67.79:22-160.251.212.122:37816.service. Feb 13 08:26:53.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.67.79:22-160.251.212.122:37816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:54.524675 sshd[9243]: Invalid user admin from 160.251.212.122 port 37816 Feb 13 08:26:54.651657 sshd[9243]: pam_faillock(sshd:auth): User unknown Feb 13 08:26:54.651953 sshd[9243]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:26:54.651980 sshd[9243]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:26:54.652266 sshd[9243]: pam_faillock(sshd:auth): User unknown Feb 13 08:26:54.651000 audit[9243]: USER_AUTH pid=9243 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="admin" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:26:55.142000 audit[9310]: NETFILTER_CFG table=filter:134 family=2 entries=7 op=nft_register_rule pid=9310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:26:55.142000 audit[9310]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffd9a2d39a0 a2=0 a3=7ffd9a2d398c items=0 ppid=3000 pid=9310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:55.142000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:26:55.144000 audit[9310]: NETFILTER_CFG table=nat:135 family=2 entries=85 op=nft_register_chain pid=9310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:26:55.144000 audit[9310]: SYSCALL arch=c000003e syscall=46 success=yes exit=28484 a0=3 a1=7ffd9a2d39a0 a2=0 a3=7ffd9a2d398c items=0 ppid=3000 pid=9310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:55.144000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:26:56.934365 sshd[9243]: Failed password for invalid user admin from 160.251.212.122 port 37816 ssh2 Feb 13 08:26:58.269252 systemd[1]: Started sshd@54-145.40.67.79:22-139.178.68.195:40486.service. Feb 13 08:26:58.268000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.67.79:22-139.178.68.195:40486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:58.296474 kernel: kauditd_printk_skb: 11 callbacks suppressed Feb 13 08:26:58.296536 kernel: audit: type=1130 audit(1707812818.268:723): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.67.79:22-139.178.68.195:40486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:58.405000 audit[9311]: USER_ACCT pid=9311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.406900 sshd[9311]: Accepted publickey for core from 139.178.68.195 port 40486 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:26:58.409303 sshd[9311]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:26:58.412171 systemd-logind[1548]: New session 53 of user core. Feb 13 08:26:58.413176 systemd[1]: Started session-53.scope. Feb 13 08:26:58.408000 audit[9311]: CRED_ACQ pid=9311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.501058 sshd[9311]: pam_unix(sshd:session): session closed for user core Feb 13 08:26:58.502742 systemd[1]: sshd@54-145.40.67.79:22-139.178.68.195:40486.service: Deactivated successfully. 
Feb 13 08:26:58.503401 systemd[1]: session-53.scope: Deactivated successfully. Feb 13 08:26:58.503447 systemd-logind[1548]: Session 53 logged out. Waiting for processes to exit. Feb 13 08:26:58.503913 systemd-logind[1548]: Removed session 53. Feb 13 08:26:58.539681 sshd[9243]: Connection closed by invalid user admin 160.251.212.122 port 37816 [preauth] Feb 13 08:26:58.540134 systemd[1]: sshd@53-145.40.67.79:22-160.251.212.122:37816.service: Deactivated successfully. Feb 13 08:26:58.588518 kernel: audit: type=1101 audit(1707812818.405:724): pid=9311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.588553 kernel: audit: type=1103 audit(1707812818.408:725): pid=9311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.588567 kernel: audit: type=1006 audit(1707812818.408:726): pid=9311 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Feb 13 08:26:58.647073 kernel: audit: type=1300 audit(1707812818.408:726): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0e798a60 a2=3 a3=0 items=0 ppid=1 pid=9311 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:58.408000 audit[9311]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0e798a60 a2=3 a3=0 items=0 ppid=1 pid=9311 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:26:58.664128 systemd[1]: Started sshd@55-145.40.67.79:22-160.251.212.122:37832.service. 
Feb 13 08:26:58.739036 kernel: audit: type=1327 audit(1707812818.408:726): proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:58.408000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:26:58.769439 kernel: audit: type=1105 audit(1707812818.414:727): pid=9311 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.414000 audit[9311]: USER_START pid=9311 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.863860 kernel: audit: type=1103 audit(1707812818.414:728): pid=9314 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.414000 audit[9314]: CRED_ACQ pid=9314 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.500000 audit[9311]: USER_END pid=9311 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:59.048394 kernel: audit: type=1106 audit(1707812818.500:729): pid=9311 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.500000 audit[9311]: CRED_DISP pid=9311 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:59.137676 kernel: audit: type=1104 audit(1707812818.500:730): pid=9311 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:26:58.501000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.67.79:22-139.178.68.195:40486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:58.539000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.67.79:22-160.251.212.122:37816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:26:58.663000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-145.40.67.79:22-160.251.212.122:37832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:26:59.293346 sshd[9339]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:26:59.292000 audit[9339]: USER_AUTH pid=9339 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:01.595814 sshd[9339]: Failed password for root from 160.251.212.122 port 37832 ssh2 Feb 13 08:27:03.259983 sshd[9339]: Connection closed by authenticating user root 160.251.212.122 port 37832 [preauth] Feb 13 08:27:03.262581 systemd[1]: sshd@55-145.40.67.79:22-160.251.212.122:37832.service: Deactivated successfully. Feb 13 08:27:03.262000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-145.40.67.79:22-160.251.212.122:37832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:03.387662 systemd[1]: Started sshd@56-145.40.67.79:22-160.251.212.122:50890.service. Feb 13 08:27:03.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.67.79:22-160.251.212.122:50890 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:03.414642 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:27:03.414684 kernel: audit: type=1130 audit(1707812823.386:736): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.67.79:22-160.251.212.122:50890 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:03.503360 systemd[1]: Started sshd@57-145.40.67.79:22-139.178.68.195:40502.service. Feb 13 08:27:03.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.67.79:22-139.178.68.195:40502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:03.592286 kernel: audit: type=1130 audit(1707812823.502:737): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.67.79:22-139.178.68.195:40502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:03.612000 audit[9371]: USER_ACCT pid=9371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.613134 sshd[9371]: Accepted publickey for core from 139.178.68.195 port 40502 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:03.614021 sshd[9371]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:03.616257 systemd-logind[1548]: New session 54 of user core. Feb 13 08:27:03.616750 systemd[1]: Started session-54.scope. 
Feb 13 08:27:03.696150 sshd[9371]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:03.697743 systemd[1]: Started sshd@58-145.40.67.79:22-139.178.68.195:40506.service. Feb 13 08:27:03.698064 systemd[1]: sshd@57-145.40.67.79:22-139.178.68.195:40502.service: Deactivated successfully. Feb 13 08:27:03.698579 systemd-logind[1548]: Session 54 logged out. Waiting for processes to exit. Feb 13 08:27:03.698624 systemd[1]: session-54.scope: Deactivated successfully. Feb 13 08:27:03.698985 systemd-logind[1548]: Removed session 54. Feb 13 08:27:03.612000 audit[9371]: CRED_ACQ pid=9371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.795349 kernel: audit: type=1101 audit(1707812823.612:738): pid=9371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.795383 kernel: audit: type=1103 audit(1707812823.612:739): pid=9371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.795397 kernel: audit: type=1006 audit(1707812823.612:740): pid=9371 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Feb 13 08:27:03.853762 kernel: audit: type=1300 audit(1707812823.612:740): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc3dfe2830 a2=3 a3=0 items=0 ppid=1 pid=9371 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:03.612000 audit[9371]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc3dfe2830 a2=3 a3=0 items=0 ppid=1 pid=9371 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:03.873913 sshd[9396]: Accepted publickey for core from 139.178.68.195 port 40506 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:03.875377 sshd[9396]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:03.877724 systemd-logind[1548]: New session 55 of user core. Feb 13 08:27:03.878113 systemd[1]: Started session-55.scope. 
Feb 13 08:27:03.885567 sshd[9369]: Invalid user ubnt from 160.251.212.122 port 50890 Feb 13 08:27:03.612000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:03.975971 kernel: audit: type=1327 audit(1707812823.612:740): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:03.976007 kernel: audit: type=1105 audit(1707812823.617:741): pid=9371 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.617000 audit[9371]: USER_START pid=9371 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.013156 sshd[9369]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:04.013370 sshd[9369]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:04.013389 sshd[9369]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:04.013590 sshd[9369]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:03.618000 audit[9374]: CRED_ACQ pid=9374 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.160086 update_engine[1550]: I0213 08:27:04.159993 1550 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Feb 13 08:27:04.160086 update_engine[1550]: I0213 08:27:04.160009 1550 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Feb 13 08:27:04.160257 kernel: audit: type=1103 audit(1707812823.618:742): pid=9374 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.160273 kernel: audit: type=1106 audit(1707812823.695:743): pid=9371 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.695000 audit[9371]: USER_END pid=9371 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.160389 update_engine[1550]: I0213 08:27:04.160351 1550 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Feb 13 08:27:04.160577 update_engine[1550]: I0213 08:27:04.160542 1550 omaha_request_params.cc:62] Current group set to lts Feb 13 08:27:04.160615 update_engine[1550]: I0213 08:27:04.160609 1550 update_attempter.cc:499] Already updated boot flags. Skipping. Feb 13 08:27:04.160615 update_engine[1550]: I0213 08:27:04.160612 1550 update_attempter.cc:643] Scheduling an action processor start. 
Feb 13 08:27:04.160654 update_engine[1550]: I0213 08:27:04.160621 1550 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 08:27:04.160654 update_engine[1550]: I0213 08:27:04.160634 1550 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Feb 13 08:27:04.160687 update_engine[1550]: I0213 08:27:04.160658 1550 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 08:27:04.160687 update_engine[1550]: I0213 08:27:04.160662 1550 omaha_request_action.cc:271] Request: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: Feb 13 08:27:04.160687 update_engine[1550]: I0213 08:27:04.160665 1550 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:27:04.160867 locksmithd[1603]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Feb 13 08:27:04.161317 update_engine[1550]: I0213 08:27:04.161281 1550 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:27:04.161350 update_engine[1550]: E0213 08:27:04.161327 1550 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:27:04.161370 update_engine[1550]: I0213 08:27:04.161357 1550 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Feb 13 08:27:03.695000 audit[9371]: CRED_DISP pid=9371 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.67.79:22-139.178.68.195:40506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:03.697000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.67.79:22-139.178.68.195:40502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:03.872000 audit[9396]: USER_ACCT pid=9396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.874000 audit[9396]: CRED_ACQ pid=9396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.874000 audit[9396]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd91bd4230 a2=3 a3=0 items=0 ppid=1 pid=9396 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:03.874000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:03.879000 audit[9396]: USER_START pid=9396 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:03.879000 audit[9400]: CRED_ACQ pid=9400 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.012000 audit[9369]: USER_AUTH pid=9369 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="ubnt" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:04.991575 sshd[9396]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:04.992000 audit[9396]: USER_END pid=9396 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.993000 audit[9396]: CRED_DISP pid=9396 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:04.996921 systemd[1]: Started sshd@59-145.40.67.79:22-139.178.68.195:40522.service. Feb 13 08:27:04.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.67.79:22-139.178.68.195:40522 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:04.997357 systemd[1]: sshd@58-145.40.67.79:22-139.178.68.195:40506.service: Deactivated successfully. Feb 13 08:27:04.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.67.79:22-139.178.68.195:40506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:04.998001 systemd-logind[1548]: Session 55 logged out. Waiting for processes to exit. Feb 13 08:27:04.998058 systemd[1]: session-55.scope: Deactivated successfully. 
Feb 13 08:27:04.998728 systemd-logind[1548]: Removed session 55. Feb 13 08:27:05.023000 audit[9421]: USER_ACCT pid=9421 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:05.024618 sshd[9421]: Accepted publickey for core from 139.178.68.195 port 40522 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:05.025000 audit[9421]: CRED_ACQ pid=9421 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:05.025000 audit[9421]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd7a933bf0 a2=3 a3=0 items=0 ppid=1 pid=9421 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:05.025000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:05.027454 sshd[9421]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:05.037065 systemd-logind[1548]: New session 56 of user core. Feb 13 08:27:05.039192 systemd[1]: Started session-56.scope. Feb 13 08:27:05.053000 audit[9421]: USER_START pid=9421 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:05.057000 audit[9425]: CRED_ACQ pid=9425 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.335442 sshd[9369]: Failed password for invalid user ubnt from 160.251.212.122 port 50890 ssh2 Feb 13 08:27:06.381788 sshd[9421]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:06.381000 audit[9421]: USER_END pid=9421 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.381000 audit[9421]: CRED_DISP pid=9421 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.383684 systemd[1]: Started sshd@60-145.40.67.79:22-139.178.68.195:36984.service. Feb 13 08:27:06.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.67.79:22-139.178.68.195:36984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:06.384078 systemd[1]: sshd@59-145.40.67.79:22-139.178.68.195:40522.service: Deactivated successfully. 
Feb 13 08:27:06.383000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.67.79:22-139.178.68.195:40522 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:06.384787 systemd-logind[1548]: Session 56 logged out. Waiting for processes to exit. Feb 13 08:27:06.384817 systemd[1]: session-56.scope: Deactivated successfully. Feb 13 08:27:06.385538 systemd-logind[1548]: Removed session 56. Feb 13 08:27:06.389000 audit[9476]: NETFILTER_CFG table=filter:136 family=2 entries=18 op=nft_register_rule pid=9476 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:27:06.389000 audit[9476]: SYSCALL arch=c000003e syscall=46 success=yes exit=10364 a0=3 a1=7ffe5af09cd0 a2=0 a3=7ffe5af09cbc items=0 ppid=3000 pid=9476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:06.389000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:27:06.390000 audit[9476]: NETFILTER_CFG table=nat:137 family=2 entries=88 op=nft_register_rule pid=9476 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:27:06.390000 audit[9476]: SYSCALL arch=c000003e syscall=46 success=yes exit=28484 a0=3 a1=7ffe5af09cd0 a2=0 a3=7ffe5af09cbc items=0 ppid=3000 pid=9476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:06.390000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:27:06.415000 audit[9468]: USER_ACCT pid=9468 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.416765 sshd[9468]: Accepted publickey for core from 139.178.68.195 port 36984 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:06.417000 audit[9468]: CRED_ACQ pid=9468 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.417000 audit[9468]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd9216b320 a2=3 a3=0 items=0 ppid=1 pid=9468 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:06.417000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:06.419623 sshd[9468]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:06.429907 systemd-logind[1548]: New session 57 of user core. Feb 13 08:27:06.433352 systemd[1]: Started session-57.scope. 
Feb 13 08:27:06.447000 audit[9468]: USER_START pid=9468 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.450000 audit[9492]: CRED_ACQ pid=9492 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.497000 audit[9505]: NETFILTER_CFG table=filter:138 family=2 entries=30 op=nft_register_rule pid=9505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:27:06.497000 audit[9505]: SYSCALL arch=c000003e syscall=46 success=yes exit=10364 a0=3 a1=7ffe7d066810 a2=0 a3=7ffe7d0667fc items=0 ppid=3000 pid=9505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:06.497000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:27:06.502000 audit[9505]: NETFILTER_CFG table=nat:139 family=2 entries=88 op=nft_register_rule pid=9505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:27:06.502000 audit[9505]: SYSCALL arch=c000003e syscall=46 success=yes exit=28484 a0=3 a1=7ffe7d066810 a2=0 a3=7ffe7d0667fc items=0 ppid=3000 pid=9505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:06.502000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:27:06.656757 sshd[9468]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:06.656000 audit[9468]: USER_END pid=9468 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.656000 audit[9468]: CRED_DISP pid=9468 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.657000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.67.79:22-139.178.68.195:37000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:06.658000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.67.79:22-139.178.68.195:36984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:06.658699 systemd[1]: Started sshd@61-145.40.67.79:22-139.178.68.195:37000.service. Feb 13 08:27:06.659083 systemd[1]: sshd@60-145.40.67.79:22-139.178.68.195:36984.service: Deactivated successfully. Feb 13 08:27:06.659626 systemd-logind[1548]: Session 57 logged out. Waiting for processes to exit. 
Feb 13 08:27:06.659686 systemd[1]: session-57.scope: Deactivated successfully. Feb 13 08:27:06.660181 systemd-logind[1548]: Removed session 57. Feb 13 08:27:06.684000 audit[9526]: USER_ACCT pid=9526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.684898 sshd[9526]: Accepted publickey for core from 139.178.68.195 port 37000 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:06.684000 audit[9526]: CRED_ACQ pid=9526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.684000 audit[9526]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffdec0bb90 a2=3 a3=0 items=0 ppid=1 pid=9526 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:06.684000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:06.685622 sshd[9526]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:06.688103 systemd-logind[1548]: New session 58 of user core. Feb 13 08:27:06.688782 systemd[1]: Started session-58.scope. Feb 13 08:27:06.690000 audit[9526]: USER_START pid=9526 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.690000 audit[9530]: CRED_ACQ pid=9530 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.812821 sshd[9526]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:06.812000 audit[9526]: USER_END pid=9526 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.812000 audit[9526]: CRED_DISP pid=9526 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:06.814345 systemd[1]: sshd@61-145.40.67.79:22-139.178.68.195:37000.service: Deactivated successfully. Feb 13 08:27:06.814000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.67.79:22-139.178.68.195:37000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:06.814994 systemd-logind[1548]: Session 58 logged out. Waiting for processes to exit. Feb 13 08:27:06.815012 systemd[1]: session-58.scope: Deactivated successfully. Feb 13 08:27:06.815745 systemd-logind[1548]: Removed session 58. 
Feb 13 08:27:07.093275 sshd[9369]: Connection closed by invalid user ubnt 160.251.212.122 port 50890 [preauth] Feb 13 08:27:07.095535 systemd[1]: sshd@56-145.40.67.79:22-160.251.212.122:50890.service: Deactivated successfully. Feb 13 08:27:07.095000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.67.79:22-160.251.212.122:50890 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:07.221875 systemd[1]: Started sshd@62-145.40.67.79:22-160.251.212.122:50904.service. Feb 13 08:27:07.221000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.67.79:22-160.251.212.122:50904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:07.868886 sshd[9556]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:27:07.869000 audit[9556]: USER_AUTH pid=9556 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:09.935214 sshd[9556]: Failed password for root from 160.251.212.122 port 50904 ssh2 Feb 13 08:27:11.821167 systemd[1]: Started sshd@63-145.40.67.79:22-139.178.68.195:37002.service. Feb 13 08:27:11.819000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.67.79:22-139.178.68.195:37002 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:11.848663 kernel: kauditd_printk_skb: 62 callbacks suppressed Feb 13 08:27:11.848743 kernel: audit: type=1130 audit(1707812831.819:790): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.67.79:22-139.178.68.195:37002 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:11.848883 sshd[9556]: Connection closed by authenticating user root 160.251.212.122 port 50904 [preauth] Feb 13 08:27:11.849369 systemd[1]: sshd@62-145.40.67.79:22-160.251.212.122:50904.service: Deactivated successfully. Feb 13 08:27:11.868953 sshd[9559]: Accepted publickey for core from 139.178.68.195 port 37002 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:11.871315 sshd[9559]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:11.873555 systemd-logind[1548]: New session 59 of user core. Feb 13 08:27:11.874023 systemd[1]: Started session-59.scope. Feb 13 08:27:11.847000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.67.79:22-160.251.212.122:50904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:11.950972 sshd[9559]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:11.952488 systemd[1]: sshd@63-145.40.67.79:22-139.178.68.195:37002.service: Deactivated successfully. Feb 13 08:27:11.953124 systemd-logind[1548]: Session 59 logged out. Waiting for processes to exit. Feb 13 08:27:11.953146 systemd[1]: session-59.scope: Deactivated successfully. Feb 13 08:27:11.953673 systemd-logind[1548]: Removed session 59. 
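Interleaved with the key-based core logins, 160.251.212.122 keeps probing with admin, root and ubnt and failing password authentication. A minimal sketch for tallying those attempts by source address and username from raw journal text like this section (journal.txt below is a placeholder for wherever the log is stored):

    import re
    from collections import Counter

    FAIL_RE = re.compile(
        r"(?:Invalid user (\S+) from|Failed password for (?:invalid user )?(\S+) from) "
        r"(\d+\.\d+\.\d+\.\d+) port (\d+)"
    )

    def failed_logins(journal_text: str) -> Counter:
        hits = Counter()
        for m in FAIL_RE.finditer(journal_text):
            user = m.group(1) or m.group(2)
            hits[(m.group(3), user)] += 1   # key: (source address, username)
        return hits

    # failed_logins(open("journal.txt").read()) would count the admin, root and
    # ubnt attempts from 160.251.212.122 recorded in this section.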
Feb 13 08:27:12.026010 kernel: audit: type=1131 audit(1707812831.847:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.67.79:22-160.251.212.122:50904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:12.026051 kernel: audit: type=1101 audit(1707812831.867:792): pid=9559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:11.867000 audit[9559]: USER_ACCT pid=9559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.027405 systemd[1]: Started sshd@64-145.40.67.79:22-160.251.212.122:50918.service. Feb 13 08:27:11.870000 audit[9559]: CRED_ACQ pid=9559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.209068 kernel: audit: type=1103 audit(1707812831.870:793): pid=9559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.209105 kernel: audit: type=1006 audit(1707812831.870:794): pid=9559 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=59 res=1 Feb 13 08:27:12.267863 kernel: audit: type=1300 audit(1707812831.870:794): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff418d8930 a2=3 a3=0 items=0 ppid=1 pid=9559 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:11.870000 audit[9559]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff418d8930 a2=3 a3=0 items=0 ppid=1 pid=9559 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:12.360141 kernel: audit: type=1327 audit(1707812831.870:794): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:11.870000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:11.874000 audit[9559]: USER_START pid=9559 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.486400 kernel: audit: type=1105 audit(1707812831.874:795): pid=9559 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.486431 kernel: audit: type=1103 audit(1707812831.874:796): pid=9564 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:11.874000 audit[9564]: CRED_ACQ pid=9564 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.575958 kernel: audit: type=1106 audit(1707812831.950:797): pid=9559 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:11.950000 audit[9559]: USER_END pid=9559 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:12.602711 sshd[9587]: Invalid user admin from 160.251.212.122 port 50918 Feb 13 08:27:11.950000 audit[9559]: CRED_DISP pid=9559 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:11.951000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.67.79:22-139.178.68.195:37002 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:12.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.67.79:22-160.251.212.122:50918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:12.739685 sshd[9587]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:12.740097 sshd[9587]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:12.740134 sshd[9587]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:12.740481 sshd[9587]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:12.739000 audit[9587]: USER_AUTH pid=9587 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="admin" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:14.162260 update_engine[1550]: I0213 08:27:14.162145 1550 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:27:14.163197 update_engine[1550]: I0213 08:27:14.162585 1550 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:27:14.163197 update_engine[1550]: E0213 08:27:14.162782 1550 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:27:14.163197 update_engine[1550]: I0213 08:27:14.162950 1550 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Feb 13 08:27:15.493870 sshd[9587]: Failed password for invalid user admin from 160.251.212.122 port 50918 ssh2 Feb 13 08:27:16.635089 sshd[9587]: Connection closed by invalid user admin 160.251.212.122 port 50918 [preauth] Feb 13 08:27:16.637726 systemd[1]: sshd@64-145.40.67.79:22-160.251.212.122:50918.service: Deactivated successfully. Feb 13 08:27:16.637000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.67.79:22-160.251.212.122:50918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:16.757413 systemd[1]: Started sshd@65-145.40.67.79:22-160.251.212.122:35904.service. Feb 13 08:27:16.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.67.79:22-160.251.212.122:35904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:16.960028 systemd[1]: Started sshd@66-145.40.67.79:22-139.178.68.195:32840.service. Feb 13 08:27:16.959000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.67.79:22-139.178.68.195:32840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:16.988379 kernel: kauditd_printk_skb: 6 callbacks suppressed Feb 13 08:27:16.988557 kernel: audit: type=1130 audit(1707812836.959:804): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.67.79:22-139.178.68.195:32840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:17.008749 sshd[9598]: Accepted publickey for core from 139.178.68.195 port 32840 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:17.010314 sshd[9598]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:17.012669 systemd-logind[1548]: New session 60 of user core. Feb 13 08:27:17.013270 systemd[1]: Started session-60.scope. 
Feb 13 08:27:17.007000 audit[9598]: USER_ACCT pid=9598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.079135 kernel: audit: type=1101 audit(1707812837.007:805): pid=9598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.091921 sshd[9598]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:17.093448 systemd[1]: sshd@66-145.40.67.79:22-139.178.68.195:32840.service: Deactivated successfully. Feb 13 08:27:17.094063 systemd[1]: session-60.scope: Deactivated successfully. Feb 13 08:27:17.094112 systemd-logind[1548]: Session 60 logged out. Waiting for processes to exit. Feb 13 08:27:17.094625 systemd-logind[1548]: Removed session 60. Feb 13 08:27:17.009000 audit[9598]: CRED_ACQ pid=9598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.261732 kernel: audit: type=1103 audit(1707812837.009:806): pid=9598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.261768 kernel: audit: type=1006 audit(1707812837.009:807): pid=9598 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=60 res=1 Feb 13 08:27:17.277452 sshd[9596]: Invalid user ansible from 160.251.212.122 port 35904 Feb 13 08:27:17.320805 kernel: audit: type=1300 audit(1707812837.009:807): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca6de24b0 a2=3 a3=0 items=0 ppid=1 pid=9598 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:17.009000 audit[9598]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca6de24b0 a2=3 a3=0 items=0 ppid=1 pid=9598 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:17.407388 sshd[9596]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:17.407594 sshd[9596]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:17.407612 sshd[9596]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:17.407791 sshd[9596]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:17.413640 kernel: audit: type=1327 audit(1707812837.009:807): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:17.009000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:17.444354 kernel: audit: type=1105 audit(1707812837.014:808): pid=9598 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:27:17.014000 audit[9598]: USER_START pid=9598 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.539669 kernel: audit: type=1103 audit(1707812837.014:809): pid=9601 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.014000 audit[9601]: CRED_ACQ pid=9601 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.091000 audit[9598]: USER_END pid=9598 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.725258 kernel: audit: type=1106 audit(1707812837.091:810): pid=9598 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.725294 kernel: audit: type=1104 audit(1707812837.091:811): pid=9598 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.091000 audit[9598]: CRED_DISP pid=9598 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:17.092000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.67.79:22-139.178.68.195:32840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:17.406000 audit[9596]: USER_AUTH pid=9596 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="ansible" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:19.845331 sshd[9596]: Failed password for invalid user ansible from 160.251.212.122 port 35904 ssh2 Feb 13 08:27:21.710480 sshd[9596]: Connection closed by invalid user ansible 160.251.212.122 port 35904 [preauth] Feb 13 08:27:21.712976 systemd[1]: sshd@65-145.40.67.79:22-160.251.212.122:35904.service: Deactivated successfully. Feb 13 08:27:21.712000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.67.79:22-160.251.212.122:35904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:21.842439 systemd[1]: Started sshd@67-145.40.67.79:22-160.251.212.122:35908.service. 
Feb 13 08:27:21.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.67.79:22-160.251.212.122:35908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:22.101365 systemd[1]: Started sshd@68-145.40.67.79:22-139.178.68.195:32856.service. Feb 13 08:27:22.100000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.67.79:22-139.178.68.195:32856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:22.128870 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:27:22.128908 kernel: audit: type=1130 audit(1707812842.100:816): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.67.79:22-139.178.68.195:32856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:22.149203 sshd[9631]: Accepted publickey for core from 139.178.68.195 port 32856 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:22.151256 sshd[9631]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:22.153705 systemd-logind[1548]: New session 61 of user core. Feb 13 08:27:22.154552 systemd[1]: Started session-61.scope. Feb 13 08:27:22.148000 audit[9631]: USER_ACCT pid=9631 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.231982 sshd[9631]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:22.233313 systemd[1]: sshd@68-145.40.67.79:22-139.178.68.195:32856.service: Deactivated successfully. Feb 13 08:27:22.233924 systemd-logind[1548]: Session 61 logged out. Waiting for processes to exit. Feb 13 08:27:22.233935 systemd[1]: session-61.scope: Deactivated successfully. Feb 13 08:27:22.234570 systemd-logind[1548]: Removed session 61. 
Feb 13 08:27:22.311144 kernel: audit: type=1101 audit(1707812842.148:817): pid=9631 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.311178 kernel: audit: type=1103 audit(1707812842.150:818): pid=9631 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.150000 audit[9631]: CRED_ACQ pid=9631 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.401328 kernel: audit: type=1006 audit(1707812842.150:819): pid=9631 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=61 res=1 Feb 13 08:27:22.460169 kernel: audit: type=1300 audit(1707812842.150:819): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5f0e3a70 a2=3 a3=0 items=0 ppid=1 pid=9631 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:22.150000 audit[9631]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5f0e3a70 a2=3 a3=0 items=0 ppid=1 pid=9631 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:22.481179 sshd[9629]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:27:22.150000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:22.583081 kernel: audit: type=1327 audit(1707812842.150:819): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:22.583112 kernel: audit: type=1105 audit(1707812842.155:820): pid=9631 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.155000 audit[9631]: USER_START pid=9631 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.156000 audit[9634]: CRED_ACQ pid=9634 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.767129 kernel: audit: type=1103 audit(1707812842.156:821): pid=9634 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.767164 kernel: audit: type=1106 audit(1707812842.231:822): pid=9631 uid=0 auid=500 ses=61 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.231000 audit[9631]: USER_END pid=9631 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.231000 audit[9631]: CRED_DISP pid=9631 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.951890 kernel: audit: type=1104 audit(1707812842.231:823): pid=9631 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:22.232000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.67.79:22-139.178.68.195:32856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:22.480000 audit[9629]: USER_AUTH pid=9629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:24.161771 update_engine[1550]: I0213 08:27:24.161648 1550 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:27:24.162616 update_engine[1550]: I0213 08:27:24.162147 1550 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:27:24.162616 update_engine[1550]: E0213 08:27:24.162343 1550 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:27:24.162616 update_engine[1550]: I0213 08:27:24.162505 1550 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Feb 13 08:27:24.607943 sshd[9629]: Failed password for root from 160.251.212.122 port 35908 ssh2 Feb 13 08:27:26.448455 sshd[9629]: Connection closed by authenticating user root 160.251.212.122 port 35908 [preauth] Feb 13 08:27:26.451032 systemd[1]: sshd@67-145.40.67.79:22-160.251.212.122:35908.service: Deactivated successfully. Feb 13 08:27:26.450000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.67.79:22-160.251.212.122:35908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:26.577211 systemd[1]: Started sshd@69-145.40.67.79:22-160.251.212.122:58366.service. Feb 13 08:27:26.576000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.67.79:22-160.251.212.122:58366 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:27.087399 sshd[9718]: Invalid user user from 160.251.212.122 port 58366 Feb 13 08:27:27.217083 sshd[9718]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:27.218256 sshd[9718]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:27.218346 sshd[9718]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:27.219311 sshd[9718]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:27.218000 audit[9718]: USER_AUTH pid=9718 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="user" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:27.236614 systemd[1]: Started sshd@70-145.40.67.79:22-139.178.68.195:44882.service. Feb 13 08:27:27.247028 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:27:27.247069 kernel: audit: type=1100 audit(1707812847.218:828): pid=9718 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="user" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:27.267301 sshd[9720]: Accepted publickey for core from 139.178.68.195 port 44882 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:27.268525 sshd[9720]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:27.271283 systemd-logind[1548]: New session 62 of user core. Feb 13 08:27:27.272131 systemd[1]: Started session-62.scope. Feb 13 08:27:27.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.67.79:22-139.178.68.195:44882 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:27.349756 sshd[9720]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:27.351228 systemd[1]: sshd@70-145.40.67.79:22-139.178.68.195:44882.service: Deactivated successfully. Feb 13 08:27:27.351852 systemd-logind[1548]: Session 62 logged out. Waiting for processes to exit. Feb 13 08:27:27.351890 systemd[1]: session-62.scope: Deactivated successfully. Feb 13 08:27:27.352421 systemd-logind[1548]: Removed session 62. Feb 13 08:27:27.424164 kernel: audit: type=1130 audit(1707812847.235:829): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.67.79:22-139.178.68.195:44882 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:27.424201 kernel: audit: type=1101 audit(1707812847.266:830): pid=9720 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.266000 audit[9720]: USER_ACCT pid=9720 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.267000 audit[9720]: CRED_ACQ pid=9720 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.606449 kernel: audit: type=1103 audit(1707812847.267:831): pid=9720 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.606481 kernel: audit: type=1006 audit(1707812847.267:832): pid=9720 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Feb 13 08:27:27.664915 kernel: audit: type=1300 audit(1707812847.267:832): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc35628310 a2=3 a3=0 items=0 ppid=1 pid=9720 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:27.267000 audit[9720]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc35628310 a2=3 a3=0 items=0 ppid=1 pid=9720 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:27.756883 kernel: audit: type=1327 audit(1707812847.267:832): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:27.267000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:27.787282 kernel: audit: type=1105 audit(1707812847.273:833): pid=9720 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.273000 audit[9720]: USER_START pid=9720 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.882601 kernel: audit: type=1103 audit(1707812847.274:834): pid=9723 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.274000 audit[9723]: CRED_ACQ pid=9723 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.971738 kernel: audit: type=1106 audit(1707812847.349:835): pid=9720 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.349000 audit[9720]: USER_END pid=9720 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.349000 audit[9720]: CRED_DISP pid=9720 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:27.350000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.67.79:22-139.178.68.195:44882 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:29.029210 sshd[9718]: Failed password for invalid user user from 160.251.212.122 port 58366 ssh2 Feb 13 08:27:29.914674 sshd[9718]: Connection closed by invalid user user 160.251.212.122 port 58366 [preauth] Feb 13 08:27:29.917250 systemd[1]: sshd@69-145.40.67.79:22-160.251.212.122:58366.service: Deactivated successfully. Feb 13 08:27:29.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.67.79:22-160.251.212.122:58366 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:30.046903 systemd[1]: Started sshd@71-145.40.67.79:22-160.251.212.122:58380.service. Feb 13 08:27:30.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.67.79:22-160.251.212.122:58380 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:30.579125 sshd[9748]: Invalid user pi from 160.251.212.122 port 58380 Feb 13 08:27:30.709963 sshd[9748]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:30.710946 sshd[9748]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:30.711064 sshd[9748]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:30.711974 sshd[9748]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:30.711000 audit[9748]: USER_AUTH pid=9748 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="pi" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:32.356619 systemd[1]: Started sshd@72-145.40.67.79:22-139.178.68.195:44896.service. Feb 13 08:27:32.355000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.67.79:22-139.178.68.195:44896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:32.384038 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:27:32.384096 kernel: audit: type=1130 audit(1707812852.355:841): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.67.79:22-139.178.68.195:44896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:32.493000 audit[9751]: USER_ACCT pid=9751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.494444 sshd[9751]: Accepted publickey for core from 139.178.68.195 port 44896 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:32.495541 sshd[9751]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:32.497850 systemd-logind[1548]: New session 63 of user core. Feb 13 08:27:32.498555 systemd[1]: Started session-63.scope. Feb 13 08:27:32.577085 sshd[9751]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:32.578524 systemd[1]: sshd@72-145.40.67.79:22-139.178.68.195:44896.service: Deactivated successfully. Feb 13 08:27:32.579159 systemd-logind[1548]: Session 63 logged out. Waiting for processes to exit. Feb 13 08:27:32.579189 systemd[1]: session-63.scope: Deactivated successfully. Feb 13 08:27:32.579707 systemd-logind[1548]: Removed session 63. Feb 13 08:27:32.494000 audit[9751]: CRED_ACQ pid=9751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.676930 kernel: audit: type=1101 audit(1707812852.493:842): pid=9751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.676972 kernel: audit: type=1103 audit(1707812852.494:843): pid=9751 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.676994 kernel: audit: type=1006 audit(1707812852.494:844): pid=9751 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1 Feb 13 08:27:32.735725 kernel: audit: type=1300 audit(1707812852.494:844): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0b246d30 a2=3 a3=0 items=0 ppid=1 pid=9751 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:32.494000 audit[9751]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0b246d30 a2=3 a3=0 items=0 ppid=1 pid=9751 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:32.494000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:32.858693 kernel: audit: type=1327 audit(1707812852.494:844): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:32.858769 kernel: audit: 
type=1105 audit(1707812852.499:845): pid=9751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.499000 audit[9751]: USER_START pid=9751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.934064 sshd[9748]: Failed password for invalid user pi from 160.251.212.122 port 58380 ssh2 Feb 13 08:27:32.953588 kernel: audit: type=1103 audit(1707812852.500:846): pid=9754 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.500000 audit[9754]: CRED_ACQ pid=9754 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.576000 audit[9751]: USER_END pid=9751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:33.138894 kernel: audit: type=1106 audit(1707812852.576:847): pid=9751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:33.138919 kernel: audit: type=1104 audit(1707812852.576:848): pid=9751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.576000 audit[9751]: CRED_DISP pid=9751 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:32.577000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.67.79:22-139.178.68.195:44896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:34.155601 sshd[9748]: Connection closed by invalid user pi 160.251.212.122 port 58380 [preauth] Feb 13 08:27:34.158091 systemd[1]: sshd@71-145.40.67.79:22-160.251.212.122:58380.service: Deactivated successfully. Feb 13 08:27:34.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.67.79:22-160.251.212.122:58380 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:34.161174 update_engine[1550]: I0213 08:27:34.161065 1550 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:27:34.161917 update_engine[1550]: I0213 08:27:34.161604 1550 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:27:34.161917 update_engine[1550]: E0213 08:27:34.161857 1550 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:27:34.162155 update_engine[1550]: I0213 08:27:34.162098 1550 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 08:27:34.162155 update_engine[1550]: I0213 08:27:34.162122 1550 omaha_request_action.cc:621] Omaha request response: Feb 13 08:27:34.162353 update_engine[1550]: E0213 08:27:34.162307 1550 omaha_request_action.cc:640] Omaha request network transfer failed. Feb 13 08:27:34.162353 update_engine[1550]: I0213 08:27:34.162346 1550 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Feb 13 08:27:34.162554 update_engine[1550]: I0213 08:27:34.162364 1550 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:27:34.162554 update_engine[1550]: I0213 08:27:34.162379 1550 update_attempter.cc:306] Processing Done. Feb 13 08:27:34.162554 update_engine[1550]: E0213 08:27:34.162412 1550 update_attempter.cc:619] Update failed. Feb 13 08:27:34.162554 update_engine[1550]: I0213 08:27:34.162429 1550 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Feb 13 08:27:34.162554 update_engine[1550]: I0213 08:27:34.162446 1550 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Feb 13 08:27:34.162554 update_engine[1550]: I0213 08:27:34.162461 1550 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Feb 13 08:27:34.162963 update_engine[1550]: I0213 08:27:34.162673 1550 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 08:27:34.162963 update_engine[1550]: I0213 08:27:34.162746 1550 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 08:27:34.162963 update_engine[1550]: I0213 08:27:34.162765 1550 omaha_request_action.cc:271] Request: Feb 13 08:27:34.162963 update_engine[1550]: Feb 13 08:27:34.162963 update_engine[1550]: Feb 13 08:27:34.162963 update_engine[1550]: Feb 13 08:27:34.162963 update_engine[1550]: Feb 13 08:27:34.162963 update_engine[1550]: Feb 13 08:27:34.162963 update_engine[1550]: Feb 13 08:27:34.162963 update_engine[1550]: I0213 08:27:34.162781 1550 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:27:34.162963 update_engine[1550]: I0213 08:27:34.162957 1550 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:27:34.163183 update_engine[1550]: E0213 08:27:34.163022 1550 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163085 1550 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163089 1550 omaha_request_action.cc:621] Omaha request response: Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163091 1550 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163094 1550 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163096 1550 update_attempter.cc:306] Processing Done. Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163098 1550 update_attempter.cc:310] Error event sent. Feb 13 08:27:34.163183 update_engine[1550]: I0213 08:27:34.163106 1550 update_check_scheduler.cc:74] Next update check in 43m38s Feb 13 08:27:34.163322 locksmithd[1603]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Feb 13 08:27:34.163322 locksmithd[1603]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Feb 13 08:27:34.283961 systemd[1]: Started sshd@73-145.40.67.79:22-160.251.212.122:33538.service. Feb 13 08:27:34.283000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.67.79:22-160.251.212.122:33538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:34.792929 sshd[9806]: Invalid user test from 160.251.212.122 port 33538 Feb 13 08:27:34.919852 sshd[9806]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:34.920876 sshd[9806]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:34.920968 sshd[9806]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:34.922088 sshd[9806]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:34.921000 audit[9806]: USER_AUTH pid=9806 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="test" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:37.028073 sshd[9806]: Failed password for invalid user test from 160.251.212.122 port 33538 ssh2 Feb 13 08:27:37.585730 systemd[1]: Started sshd@74-145.40.67.79:22-139.178.68.195:57126.service. Feb 13 08:27:37.585000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.67.79:22-139.178.68.195:57126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:37.613330 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:27:37.613392 kernel: audit: type=1130 audit(1707812857.585:853): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.67.79:22-139.178.68.195:57126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:37.634116 sshd[9808]: Accepted publickey for core from 139.178.68.195 port 57126 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:37.635305 sshd[9808]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:37.637903 systemd-logind[1548]: New session 64 of user core. Feb 13 08:27:37.638507 systemd[1]: Started session-64.scope. Feb 13 08:27:37.632000 audit[9808]: USER_ACCT pid=9808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.720036 sshd[9808]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:37.721363 systemd[1]: sshd@74-145.40.67.79:22-139.178.68.195:57126.service: Deactivated successfully. Feb 13 08:27:37.721915 systemd-logind[1548]: Session 64 logged out. Waiting for processes to exit. Feb 13 08:27:37.721924 systemd[1]: session-64.scope: Deactivated successfully. Feb 13 08:27:37.722509 systemd-logind[1548]: Removed session 64. 
Feb 13 08:27:37.796058 kernel: audit: type=1101 audit(1707812857.632:854): pid=9808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.796095 kernel: audit: type=1103 audit(1707812857.634:855): pid=9808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.634000 audit[9808]: CRED_ACQ pid=9808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.944952 kernel: audit: type=1006 audit(1707812857.634:856): pid=9808 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1 Feb 13 08:27:37.944983 kernel: audit: type=1300 audit(1707812857.634:856): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcfb517940 a2=3 a3=0 items=0 ppid=1 pid=9808 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:37.634000 audit[9808]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcfb517940 a2=3 a3=0 items=0 ppid=1 pid=9808 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:38.037277 kernel: audit: type=1327 audit(1707812857.634:856): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:37.634000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:38.067864 kernel: audit: type=1105 audit(1707812857.639:857): pid=9808 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.639000 audit[9808]: USER_START pid=9808 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.640000 audit[9811]: CRED_ACQ pid=9811 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:38.163083 kernel: audit: type=1103 audit(1707812857.640:858): pid=9811 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.719000 audit[9808]: USER_END pid=9808 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:38.347261 kernel: audit: type=1106 audit(1707812857.719:859): pid=9808 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:38.347289 kernel: audit: type=1104 audit(1707812857.719:860): pid=9808 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:37.719000 audit[9808]: CRED_DISP pid=9808 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:38.347351 sshd[9806]: Connection closed by invalid user test 160.251.212.122 port 33538 [preauth] Feb 13 08:27:38.348009 systemd[1]: sshd@73-145.40.67.79:22-160.251.212.122:33538.service: Deactivated successfully. Feb 13 08:27:37.720000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.67.79:22-139.178.68.195:57126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:38.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.67.79:22-160.251.212.122:33538 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:38.472924 systemd[1]: Started sshd@75-145.40.67.79:22-160.251.212.122:33542.service. Feb 13 08:27:38.472000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.67.79:22-160.251.212.122:33542 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:39.108045 sshd[9836]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:27:39.107000 audit[9836]: USER_AUTH pid=9836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:40.899030 sshd[9836]: Failed password for root from 160.251.212.122 port 33542 ssh2 Feb 13 08:27:41.151964 sshd[9836]: Connection closed by authenticating user root 160.251.212.122 port 33542 [preauth] Feb 13 08:27:41.154483 systemd[1]: sshd@75-145.40.67.79:22-160.251.212.122:33542.service: Deactivated successfully. Feb 13 08:27:41.154000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.67.79:22-160.251.212.122:33542 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:41.280787 systemd[1]: Started sshd@76-145.40.67.79:22-160.251.212.122:33558.service. Feb 13 08:27:41.280000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.67.79:22-160.251.212.122:33558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:41.915969 sshd[9842]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:27:41.915000 audit[9842]: ANOM_LOGIN_FAILURES pid=9842 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:41.915000 audit[9842]: USER_AUTH pid=9842 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:41.916254 sshd[9842]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:27:42.726472 systemd[1]: Started sshd@77-145.40.67.79:22-139.178.68.195:57140.service. Feb 13 08:27:42.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.67.79:22-139.178.68.195:57140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:42.753189 kernel: kauditd_printk_skb: 8 callbacks suppressed Feb 13 08:27:42.753333 kernel: audit: type=1130 audit(1707812862.725:869): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.67.79:22-139.178.68.195:57140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:42.773524 sshd[9844]: Accepted publickey for core from 139.178.68.195 port 57140 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:42.774524 sshd[9844]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:42.776926 systemd-logind[1548]: New session 65 of user core. Feb 13 08:27:42.777563 systemd[1]: Started session-65.scope. Feb 13 08:27:42.772000 audit[9844]: USER_ACCT pid=9844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.855016 sshd[9844]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:42.856495 systemd[1]: sshd@77-145.40.67.79:22-139.178.68.195:57140.service: Deactivated successfully. Feb 13 08:27:42.857090 systemd[1]: session-65.scope: Deactivated successfully. Feb 13 08:27:42.857153 systemd-logind[1548]: Session 65 logged out. Waiting for processes to exit. Feb 13 08:27:42.857597 systemd-logind[1548]: Removed session 65. 
Feb 13 08:27:42.933070 kernel: audit: type=1101 audit(1707812862.772:870): pid=9844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.933112 kernel: audit: type=1103 audit(1707812862.773:871): pid=9844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.773000 audit[9844]: CRED_ACQ pid=9844 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:43.081882 kernel: audit: type=1006 audit(1707812862.773:872): pid=9844 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1 Feb 13 08:27:43.081916 kernel: audit: type=1300 audit(1707812862.773:872): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff64f84360 a2=3 a3=0 items=0 ppid=1 pid=9844 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:42.773000 audit[9844]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff64f84360 a2=3 a3=0 items=0 ppid=1 pid=9844 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:43.173754 kernel: audit: type=1327 audit(1707812862.773:872): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:42.773000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:43.204146 kernel: audit: type=1105 audit(1707812862.778:873): pid=9844 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.778000 audit[9844]: USER_START pid=9844 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:43.298436 kernel: audit: type=1103 audit(1707812862.779:874): pid=9848 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.779000 audit[9848]: CRED_ACQ pid=9848 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:43.387397 kernel: audit: type=1106 audit(1707812862.854:875): pid=9844 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.854000 audit[9844]: USER_END pid=9844 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:43.482681 kernel: audit: type=1104 audit(1707812862.854:876): pid=9844 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.854000 audit[9844]: CRED_DISP pid=9844 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:42.855000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.67.79:22-139.178.68.195:57140 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:43.983178 sshd[9842]: Failed password for root from 160.251.212.122 port 33558 ssh2 Feb 13 08:27:45.881646 sshd[9842]: Connection closed by authenticating user root 160.251.212.122 port 33558 [preauth] Feb 13 08:27:45.884341 systemd[1]: sshd@76-145.40.67.79:22-160.251.212.122:33558.service: Deactivated successfully. Feb 13 08:27:45.884000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.67.79:22-160.251.212.122:33558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:46.010381 systemd[1]: Started sshd@78-145.40.67.79:22-160.251.212.122:40738.service. Feb 13 08:27:46.009000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.67.79:22-160.251.212.122:40738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:46.519338 sshd[9873]: Invalid user postgres from 160.251.212.122 port 40738 Feb 13 08:27:46.646650 sshd[9873]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:46.646904 sshd[9873]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:46.646926 sshd[9873]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:46.647187 sshd[9873]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:46.646000 audit[9873]: USER_AUTH pid=9873 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="postgres" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:47.863585 systemd[1]: Started sshd@79-145.40.67.79:22-139.178.68.195:59418.service. Feb 13 08:27:47.863000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.67.79:22-139.178.68.195:59418 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:27:47.891068 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:27:47.891141 kernel: audit: type=1130 audit(1707812867.863:881): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.67.79:22-139.178.68.195:59418 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:47.911369 sshd[9875]: Accepted publickey for core from 139.178.68.195 port 59418 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:47.913323 sshd[9875]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:47.915712 systemd-logind[1548]: New session 66 of user core. Feb 13 08:27:47.916366 systemd[1]: Started session-66.scope. Feb 13 08:27:47.910000 audit[9875]: USER_ACCT pid=9875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.996962 sshd[9875]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:47.998570 systemd[1]: sshd@79-145.40.67.79:22-139.178.68.195:59418.service: Deactivated successfully. Feb 13 08:27:47.999245 systemd[1]: session-66.scope: Deactivated successfully. Feb 13 08:27:47.999269 systemd-logind[1548]: Session 66 logged out. Waiting for processes to exit. Feb 13 08:27:47.999802 systemd-logind[1548]: Removed session 66. Feb 13 08:27:48.072170 kernel: audit: type=1101 audit(1707812867.910:882): pid=9875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:48.072198 kernel: audit: type=1103 audit(1707812867.912:883): pid=9875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.912000 audit[9875]: CRED_ACQ pid=9875 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:48.220851 kernel: audit: type=1006 audit(1707812867.912:884): pid=9875 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1 Feb 13 08:27:48.220884 kernel: audit: type=1300 audit(1707812867.912:884): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdded84580 a2=3 a3=0 items=0 ppid=1 pid=9875 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:47.912000 audit[9875]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdded84580 a2=3 a3=0 items=0 ppid=1 pid=9875 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:48.312694 kernel: audit: type=1327 audit(1707812867.912:884): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:47.912000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:48.343069 kernel: audit: 
type=1105 audit(1707812867.917:885): pid=9875 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.917000 audit[9875]: USER_START pid=9875 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:48.437378 kernel: audit: type=1103 audit(1707812867.918:886): pid=9878 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.918000 audit[9878]: CRED_ACQ pid=9878 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:48.526430 kernel: audit: type=1106 audit(1707812867.996:887): pid=9875 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.996000 audit[9875]: USER_END pid=9875 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:48.621752 kernel: audit: type=1104 audit(1707812867.996:888): pid=9875 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.996000 audit[9875]: CRED_DISP pid=9875 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:47.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.67.79:22-139.178.68.195:59418 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:48.733706 sshd[9873]: Failed password for invalid user postgres from 160.251.212.122 port 40738 ssh2 Feb 13 08:27:50.897191 sshd[9873]: Connection closed by invalid user postgres 160.251.212.122 port 40738 [preauth] Feb 13 08:27:50.899706 systemd[1]: sshd@78-145.40.67.79:22-160.251.212.122:40738.service: Deactivated successfully. Feb 13 08:27:50.899000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.67.79:22-160.251.212.122:40738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:51.023969 systemd[1]: Started sshd@80-145.40.67.79:22-160.251.212.122:40754.service. 
Feb 13 08:27:51.023000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.67.79:22-160.251.212.122:40754 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:51.529098 sshd[9903]: Invalid user admin from 160.251.212.122 port 40754 Feb 13 08:27:51.654897 sshd[9903]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:51.655272 sshd[9903]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:27:51.655303 sshd[9903]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:27:51.655599 sshd[9903]: pam_faillock(sshd:auth): User unknown Feb 13 08:27:51.654000 audit[9903]: USER_AUTH pid=9903 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="admin" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:53.004631 systemd[1]: Started sshd@81-145.40.67.79:22-139.178.68.195:59422.service. Feb 13 08:27:53.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.67.79:22-139.178.68.195:59422 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:53.032079 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:27:53.032185 kernel: audit: type=1130 audit(1707812873.004:893): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.67.79:22-139.178.68.195:59422 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:53.052612 sshd[9906]: Accepted publickey for core from 139.178.68.195 port 59422 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:53.053314 sshd[9906]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:53.055959 systemd-logind[1548]: New session 67 of user core. Feb 13 08:27:53.056629 systemd[1]: Started session-67.scope. Feb 13 08:27:53.051000 audit[9906]: USER_ACCT pid=9906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.122099 kernel: audit: type=1101 audit(1707812873.051:894): pid=9906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.134947 sshd[9906]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:53.136418 systemd[1]: sshd@81-145.40.67.79:22-139.178.68.195:59422.service: Deactivated successfully. Feb 13 08:27:53.136960 systemd-logind[1548]: Session 67 logged out. Waiting for processes to exit. Feb 13 08:27:53.137024 systemd[1]: session-67.scope: Deactivated successfully. Feb 13 08:27:53.137630 systemd-logind[1548]: Removed session 67. 
Feb 13 08:27:53.052000 audit[9906]: CRED_ACQ pid=9906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.303653 kernel: audit: type=1103 audit(1707812873.052:895): pid=9906 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.303692 kernel: audit: type=1006 audit(1707812873.052:896): pid=9906 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=67 res=1 Feb 13 08:27:53.052000 audit[9906]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3a842e90 a2=3 a3=0 items=0 ppid=1 pid=9906 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:53.454003 kernel: audit: type=1300 audit(1707812873.052:896): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3a842e90 a2=3 a3=0 items=0 ppid=1 pid=9906 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:53.454032 kernel: audit: type=1327 audit(1707812873.052:896): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:53.052000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:53.484418 kernel: audit: type=1105 audit(1707812873.057:897): pid=9906 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.057000 audit[9906]: USER_START pid=9906 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.578735 kernel: audit: type=1103 audit(1707812873.058:898): pid=9910 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.058000 audit[9910]: CRED_ACQ pid=9910 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.134000 audit[9906]: USER_END pid=9906 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.761139 sshd[9903]: Failed password for invalid user admin from 160.251.212.122 port 40754 ssh2 Feb 13 08:27:53.763183 kernel: audit: type=1106 audit(1707812873.134:899): pid=9906 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.763223 kernel: audit: type=1104 audit(1707812873.134:900): pid=9906 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.134000 audit[9906]: CRED_DISP pid=9906 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:53.135000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.67.79:22-139.178.68.195:59422 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:55.542299 sshd[9903]: Connection closed by invalid user admin 160.251.212.122 port 40754 [preauth] Feb 13 08:27:55.544724 systemd[1]: sshd@80-145.40.67.79:22-160.251.212.122:40754.service: Deactivated successfully. Feb 13 08:27:55.544000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.67.79:22-160.251.212.122:40754 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:55.673367 systemd[1]: Started sshd@82-145.40.67.79:22-160.251.212.122:44634.service. Feb 13 08:27:55.672000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.67.79:22-160.251.212.122:44634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:56.314118 sshd[9983]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:27:56.313000 audit[9983]: USER_AUTH pid=9983 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:27:58.141158 systemd[1]: Started sshd@83-145.40.67.79:22-139.178.68.195:35040.service. Feb 13 08:27:58.140000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.67.79:22-139.178.68.195:35040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:58.168222 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:27:58.168302 kernel: audit: type=1130 audit(1707812878.140:905): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.67.79:22-139.178.68.195:35040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:27:58.188647 sshd[9985]: Accepted publickey for core from 139.178.68.195 port 35040 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:27:58.190308 sshd[9985]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:27:58.193640 systemd-logind[1548]: New session 68 of user core. Feb 13 08:27:58.194109 systemd[1]: Started session-68.scope. 
Feb 13 08:27:58.187000 audit[9985]: USER_ACCT pid=9985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.276048 sshd[9985]: pam_unix(sshd:session): session closed for user core Feb 13 08:27:58.277586 systemd[1]: sshd@83-145.40.67.79:22-139.178.68.195:35040.service: Deactivated successfully. Feb 13 08:27:58.278293 systemd[1]: session-68.scope: Deactivated successfully. Feb 13 08:27:58.278299 systemd-logind[1548]: Session 68 logged out. Waiting for processes to exit. Feb 13 08:27:58.278887 systemd-logind[1548]: Removed session 68. Feb 13 08:27:58.350773 kernel: audit: type=1101 audit(1707812878.187:906): pid=9985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.350811 kernel: audit: type=1103 audit(1707812878.189:907): pid=9985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.189000 audit[9985]: CRED_ACQ pid=9985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.440686 sshd[9983]: Failed password for root from 160.251.212.122 port 44634 ssh2 Feb 13 08:27:58.441138 kernel: audit: type=1006 audit(1707812878.189:908): pid=9985 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=68 res=1 Feb 13 08:27:58.189000 audit[9985]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5b778e70 a2=3 a3=0 items=0 ppid=1 pid=9985 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:58.591505 kernel: audit: type=1300 audit(1707812878.189:908): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5b778e70 a2=3 a3=0 items=0 ppid=1 pid=9985 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:27:58.591537 kernel: audit: type=1327 audit(1707812878.189:908): proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:58.189000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:27:58.621926 kernel: audit: type=1105 audit(1707812878.197:909): pid=9985 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.197000 audit[9985]: USER_START pid=9985 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:27:58.199000 audit[9988]: CRED_ACQ pid=9988 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.805401 kernel: audit: type=1103 audit(1707812878.199:910): pid=9988 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.805435 kernel: audit: type=1106 audit(1707812878.275:911): pid=9985 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.275000 audit[9985]: USER_END pid=9985 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.900810 kernel: audit: type=1104 audit(1707812878.275:912): pid=9985 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.275000 audit[9985]: CRED_DISP pid=9985 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:27:58.276000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.67.79:22-139.178.68.195:35040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:00.280951 sshd[9983]: Connection closed by authenticating user root 160.251.212.122 port 44634 [preauth] Feb 13 08:28:00.283420 systemd[1]: sshd@82-145.40.67.79:22-160.251.212.122:44634.service: Deactivated successfully. Feb 13 08:28:00.283000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.67.79:22-160.251.212.122:44634 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:00.408654 systemd[1]: Started sshd@84-145.40.67.79:22-160.251.212.122:44638.service. Feb 13 08:28:00.407000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.67.79:22-160.251.212.122:44638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:01.042313 sshd[10013]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:28:01.041000 audit[10013]: USER_AUTH pid=10013 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:03.189113 sshd[10013]: Failed password for root from 160.251.212.122 port 44638 ssh2 Feb 13 08:28:03.282739 systemd[1]: Started sshd@85-145.40.67.79:22-139.178.68.195:35042.service. Feb 13 08:28:03.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.67.79:22-139.178.68.195:35042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:03.309548 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:28:03.309592 kernel: audit: type=1130 audit(1707812883.281:917): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.67.79:22-139.178.68.195:35042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:03.419000 audit[10044]: USER_ACCT pid=10044 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.420477 sshd[10044]: Accepted publickey for core from 139.178.68.195 port 35042 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:03.421666 sshd[10044]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:03.423906 systemd-logind[1548]: New session 69 of user core. Feb 13 08:28:03.424530 systemd[1]: Started session-69.scope. Feb 13 08:28:03.505060 sshd[10044]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:03.506551 systemd[1]: sshd@85-145.40.67.79:22-139.178.68.195:35042.service: Deactivated successfully. Feb 13 08:28:03.507239 systemd-logind[1548]: Session 69 logged out. Waiting for processes to exit. Feb 13 08:28:03.507240 systemd[1]: session-69.scope: Deactivated successfully. Feb 13 08:28:03.507779 systemd-logind[1548]: Removed session 69. 
Feb 13 08:28:03.420000 audit[10044]: CRED_ACQ pid=10044 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.603918 kernel: audit: type=1101 audit(1707812883.419:918): pid=10044 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.603953 kernel: audit: type=1103 audit(1707812883.420:919): pid=10044 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.603968 kernel: audit: type=1006 audit(1707812883.420:920): pid=10044 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=69 res=1 Feb 13 08:28:03.662495 kernel: audit: type=1300 audit(1707812883.420:920): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb644e8b0 a2=3 a3=0 items=0 ppid=1 pid=10044 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:03.420000 audit[10044]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb644e8b0 a2=3 a3=0 items=0 ppid=1 pid=10044 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:03.754516 kernel: audit: type=1327 audit(1707812883.420:920): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:03.420000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:03.784879 kernel: audit: type=1105 audit(1707812883.425:921): pid=10044 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.425000 audit[10044]: USER_START pid=10044 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.879300 kernel: audit: type=1103 audit(1707812883.426:922): pid=10047 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.426000 audit[10047]: CRED_ACQ pid=10047 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.968425 kernel: audit: type=1106 audit(1707812883.504:923): pid=10044 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.504000 audit[10044]: USER_END pid=10044 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:04.063822 kernel: audit: type=1104 audit(1707812883.504:924): pid=10044 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.504000 audit[10044]: CRED_DISP pid=10044 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:03.505000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.67.79:22-139.178.68.195:35042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:05.008517 sshd[10013]: Connection closed by authenticating user root 160.251.212.122 port 44638 [preauth] Feb 13 08:28:05.011612 systemd[1]: sshd@84-145.40.67.79:22-160.251.212.122:44638.service: Deactivated successfully. Feb 13 08:28:05.011000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.67.79:22-160.251.212.122:44638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:05.135322 systemd[1]: Started sshd@86-145.40.67.79:22-160.251.212.122:41810.service. Feb 13 08:28:05.134000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.67.79:22-160.251.212.122:41810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:05.637290 sshd[10073]: Invalid user steam from 160.251.212.122 port 41810 Feb 13 08:28:05.772843 sshd[10073]: pam_faillock(sshd:auth): User unknown Feb 13 08:28:05.774007 sshd[10073]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:28:05.774104 sshd[10073]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:28:05.775125 sshd[10073]: pam_faillock(sshd:auth): User unknown Feb 13 08:28:05.774000 audit[10073]: USER_AUTH pid=10073 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="steam" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:07.470266 sshd[10073]: Failed password for invalid user steam from 160.251.212.122 port 41810 ssh2 Feb 13 08:28:08.063703 sshd[10073]: Connection closed by invalid user steam 160.251.212.122 port 41810 [preauth] Feb 13 08:28:08.066148 systemd[1]: sshd@86-145.40.67.79:22-160.251.212.122:41810.service: Deactivated successfully. 
Feb 13 08:28:08.065000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.67.79:22-160.251.212.122:41810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:08.190919 systemd[1]: Started sshd@87-145.40.67.79:22-160.251.212.122:41816.service. Feb 13 08:28:08.190000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.67.79:22-160.251.212.122:41816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:08.513979 systemd[1]: Started sshd@88-145.40.67.79:22-139.178.68.195:53146.service. Feb 13 08:28:08.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.67.79:22-139.178.68.195:53146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:08.541788 kernel: kauditd_printk_skb: 6 callbacks suppressed Feb 13 08:28:08.541866 kernel: audit: type=1130 audit(1707812888.513:931): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.67.79:22-139.178.68.195:53146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:08.562171 sshd[10079]: Accepted publickey for core from 139.178.68.195 port 53146 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:08.563735 sshd[10079]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:08.566504 systemd-logind[1548]: New session 70 of user core. Feb 13 08:28:08.567133 systemd[1]: Started session-70.scope. Feb 13 08:28:08.561000 audit[10079]: USER_ACCT pid=10079 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.645761 sshd[10079]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:08.647068 systemd[1]: sshd@88-145.40.67.79:22-139.178.68.195:53146.service: Deactivated successfully. Feb 13 08:28:08.647671 systemd-logind[1548]: Session 70 logged out. Waiting for processes to exit. Feb 13 08:28:08.647706 systemd[1]: session-70.scope: Deactivated successfully. Feb 13 08:28:08.648180 systemd-logind[1548]: Removed session 70. 
Feb 13 08:28:08.722079 kernel: audit: type=1101 audit(1707812888.561:932): pid=10079 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.722126 kernel: audit: type=1103 audit(1707812888.562:933): pid=10079 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.562000 audit[10079]: CRED_ACQ pid=10079 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.812513 kernel: audit: type=1006 audit(1707812888.562:934): pid=10079 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=70 res=1 Feb 13 08:28:08.820996 sshd[10077]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:28:08.871067 kernel: audit: type=1300 audit(1707812888.562:934): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcbbb1b3f0 a2=3 a3=0 items=0 ppid=1 pid=10079 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:08.562000 audit[10079]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcbbb1b3f0 a2=3 a3=0 items=0 ppid=1 pid=10079 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:08.562000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:08.993260 kernel: audit: type=1327 audit(1707812888.562:934): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:08.993291 kernel: audit: type=1105 audit(1707812888.568:935): pid=10079 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.568000 audit[10079]: USER_START pid=10079 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:09.087673 kernel: audit: type=1103 audit(1707812888.569:936): pid=10082 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.569000 audit[10082]: CRED_ACQ pid=10082 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.645000 audit[10079]: USER_END pid=10079 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:09.272070 kernel: audit: type=1106 audit(1707812888.645:937): pid=10079 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:09.272100 kernel: audit: type=1104 audit(1707812888.645:938): pid=10079 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.645000 audit[10079]: CRED_DISP pid=10079 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:08.646000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.67.79:22-139.178.68.195:53146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:08.820000 audit[10077]: USER_AUTH pid=10077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:10.927219 sshd[10077]: Failed password for root from 160.251.212.122 port 41816 ssh2 Feb 13 08:28:12.786882 sshd[10077]: Connection closed by authenticating user root 160.251.212.122 port 41816 [preauth] Feb 13 08:28:12.789513 systemd[1]: sshd@87-145.40.67.79:22-160.251.212.122:41816.service: Deactivated successfully. Feb 13 08:28:12.789000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.67.79:22-160.251.212.122:41816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:12.914107 systemd[1]: Started sshd@89-145.40.67.79:22-160.251.212.122:41822.service. Feb 13 08:28:12.913000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.67.79:22-160.251.212.122:41822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:13.555327 sshd[10107]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:28:13.554000 audit[10107]: USER_AUTH pid=10107 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:13.582924 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:28:13.582999 kernel: audit: type=1100 audit(1707812893.554:943): pid=10107 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:13.648467 systemd[1]: Started sshd@90-145.40.67.79:22-139.178.68.195:53148.service. Feb 13 08:28:13.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.67.79:22-139.178.68.195:53148 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:13.693312 sshd[10110]: Accepted publickey for core from 139.178.68.195 port 53148 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:13.694500 sshd[10110]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:13.696989 systemd-logind[1548]: New session 71 of user core. Feb 13 08:28:13.697745 systemd[1]: Started session-71.scope. Feb 13 08:28:13.761142 kernel: audit: type=1130 audit(1707812893.647:944): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.67.79:22-139.178.68.195:53148 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:13.761212 kernel: audit: type=1101 audit(1707812893.692:945): pid=10110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.692000 audit[10110]: USER_ACCT pid=10110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.775915 sshd[10110]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:13.777406 systemd[1]: sshd@90-145.40.67.79:22-139.178.68.195:53148.service: Deactivated successfully. Feb 13 08:28:13.777987 systemd-logind[1548]: Session 71 logged out. Waiting for processes to exit. Feb 13 08:28:13.778036 systemd[1]: session-71.scope: Deactivated successfully. Feb 13 08:28:13.778676 systemd-logind[1548]: Removed session 71. 
Feb 13 08:28:13.853007 kernel: audit: type=1103 audit(1707812893.693:946): pid=10110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.693000 audit[10110]: CRED_ACQ pid=10110 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.943368 kernel: audit: type=1006 audit(1707812893.693:947): pid=10110 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=71 res=1 Feb 13 08:28:14.001856 kernel: audit: type=1300 audit(1707812893.693:947): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc5c55a20 a2=3 a3=0 items=0 ppid=1 pid=10110 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:13.693000 audit[10110]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc5c55a20 a2=3 a3=0 items=0 ppid=1 pid=10110 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:14.093664 kernel: audit: type=1327 audit(1707812893.693:947): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:13.693000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:14.124013 kernel: audit: type=1105 audit(1707812893.699:948): pid=10110 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.699000 audit[10110]: USER_START pid=10110 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:14.219185 kernel: audit: type=1103 audit(1707812893.699:949): pid=10113 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.699000 audit[10113]: CRED_ACQ pid=10113 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:14.308298 kernel: audit: type=1106 audit(1707812893.775:950): pid=10110 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.775000 audit[10110]: USER_END pid=10110 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.775000 audit[10110]: CRED_DISP pid=10110 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:13.776000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.67.79:22-139.178.68.195:53148 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:15.682217 sshd[10107]: Failed password for root from 160.251.212.122 port 41822 ssh2 Feb 13 08:28:17.520563 sshd[10107]: Connection closed by authenticating user root 160.251.212.122 port 41822 [preauth] Feb 13 08:28:17.523227 systemd[1]: sshd@89-145.40.67.79:22-160.251.212.122:41822.service: Deactivated successfully. Feb 13 08:28:17.522000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.67.79:22-160.251.212.122:41822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:17.648881 systemd[1]: Started sshd@91-145.40.67.79:22-160.251.212.122:42020.service. Feb 13 08:28:17.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.67.79:22-160.251.212.122:42020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:18.293411 sshd[10139]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:28:18.292000 audit[10139]: USER_AUTH pid=10139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:18.782011 systemd[1]: Started sshd@92-145.40.67.79:22-139.178.68.195:37926.service. Feb 13 08:28:18.781000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.67.79:22-139.178.68.195:37926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:18.808914 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:28:18.809000 kernel: audit: type=1130 audit(1707812898.781:956): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.67.79:22-139.178.68.195:37926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:18.829422 sshd[10141]: Accepted publickey for core from 139.178.68.195 port 37926 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:18.831875 sshd[10141]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:18.834397 systemd-logind[1548]: New session 72 of user core. Feb 13 08:28:18.834946 systemd[1]: Started session-72.scope. 
Feb 13 08:28:18.828000 audit[10141]: USER_ACCT pid=10141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.915491 sshd[10141]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:18.916836 systemd[1]: sshd@92-145.40.67.79:22-139.178.68.195:37926.service: Deactivated successfully. Feb 13 08:28:18.917512 systemd[1]: session-72.scope: Deactivated successfully. Feb 13 08:28:18.917527 systemd-logind[1548]: Session 72 logged out. Waiting for processes to exit. Feb 13 08:28:18.917956 systemd-logind[1548]: Removed session 72. Feb 13 08:28:18.991080 kernel: audit: type=1101 audit(1707812898.828:957): pid=10141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.991124 kernel: audit: type=1103 audit(1707812898.830:958): pid=10141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.830000 audit[10141]: CRED_ACQ pid=10141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:19.081905 kernel: audit: type=1006 audit(1707812898.830:959): pid=10141 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=72 res=1 Feb 13 08:28:19.140668 kernel: audit: type=1300 audit(1707812898.830:959): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff560b8860 a2=3 a3=0 items=0 ppid=1 pid=10141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:18.830000 audit[10141]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff560b8860 a2=3 a3=0 items=0 ppid=1 pid=10141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:19.233015 kernel: audit: type=1327 audit(1707812898.830:959): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:18.830000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:19.263504 kernel: audit: type=1105 audit(1707812898.836:960): pid=10141 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.836000 audit[10141]: USER_START pid=10141 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:19.358240 kernel: audit: type=1103 audit(1707812898.837:961): pid=10144 uid=0 
auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.837000 audit[10144]: CRED_ACQ pid=10144 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:19.447709 kernel: audit: type=1106 audit(1707812898.915:962): pid=10141 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.915000 audit[10141]: USER_END pid=10141 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:19.543418 kernel: audit: type=1104 audit(1707812898.915:963): pid=10141 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.915000 audit[10141]: CRED_DISP pid=10141 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:18.915000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.67.79:22-139.178.68.195:37926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:20.440404 sshd[10139]: Failed password for root from 160.251.212.122 port 42020 ssh2 Feb 13 08:28:22.259455 sshd[10139]: Connection closed by authenticating user root 160.251.212.122 port 42020 [preauth] Feb 13 08:28:22.261960 systemd[1]: sshd@91-145.40.67.79:22-160.251.212.122:42020.service: Deactivated successfully. Feb 13 08:28:22.261000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.67.79:22-160.251.212.122:42020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:22.388723 systemd[1]: Started sshd@93-145.40.67.79:22-160.251.212.122:42026.service. Feb 13 08:28:22.387000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.67.79:22-160.251.212.122:42026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:22.897898 sshd[10169]: Invalid user craft from 160.251.212.122 port 42026 Feb 13 08:28:23.027056 sshd[10169]: pam_faillock(sshd:auth): User unknown Feb 13 08:28:23.028081 sshd[10169]: pam_unix(sshd:auth): check pass; user unknown Feb 13 08:28:23.028178 sshd[10169]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 Feb 13 08:28:23.029181 sshd[10169]: pam_faillock(sshd:auth): User unknown Feb 13 08:28:23.028000 audit[10169]: USER_AUTH pid=10169 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="craft" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:23.921675 systemd[1]: Started sshd@94-145.40.67.79:22-139.178.68.195:37938.service. Feb 13 08:28:23.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.67.79:22-139.178.68.195:37938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:23.948454 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:28:23.948510 kernel: audit: type=1130 audit(1707812903.920:968): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.67.79:22-139.178.68.195:37938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:23.968921 sshd[10172]: Accepted publickey for core from 139.178.68.195 port 37938 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:23.970352 sshd[10172]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:23.972987 systemd-logind[1548]: New session 73 of user core. Feb 13 08:28:23.973759 systemd[1]: Started session-73.scope. Feb 13 08:28:23.967000 audit[10172]: USER_ACCT pid=10172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.038038 kernel: audit: type=1101 audit(1707812903.967:969): pid=10172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.051161 sshd[10172]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:24.052592 systemd[1]: sshd@94-145.40.67.79:22-139.178.68.195:37938.service: Deactivated successfully. Feb 13 08:28:24.053242 systemd[1]: session-73.scope: Deactivated successfully. Feb 13 08:28:24.053289 systemd-logind[1548]: Session 73 logged out. Waiting for processes to exit. Feb 13 08:28:24.053817 systemd-logind[1548]: Removed session 73. 
Feb 13 08:28:23.969000 audit[10172]: CRED_ACQ pid=10172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.219137 kernel: audit: type=1103 audit(1707812903.969:970): pid=10172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.219193 kernel: audit: type=1006 audit(1707812903.969:971): pid=10172 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=73 res=1 Feb 13 08:28:24.277989 kernel: audit: type=1300 audit(1707812903.969:971): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6057ea50 a2=3 a3=0 items=0 ppid=1 pid=10172 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:23.969000 audit[10172]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6057ea50 a2=3 a3=0 items=0 ppid=1 pid=10172 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:24.370238 kernel: audit: type=1327 audit(1707812903.969:971): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:23.969000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:24.400770 kernel: audit: type=1105 audit(1707812903.974:972): pid=10172 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:23.974000 audit[10172]: USER_START pid=10172 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.495564 kernel: audit: type=1103 audit(1707812903.975:973): pid=10175 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:23.975000 audit[10175]: CRED_ACQ pid=10175 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.584557 kernel: audit: type=1106 audit(1707812904.050:974): pid=10172 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.050000 audit[10172]: USER_END pid=10172 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.679888 kernel: audit: type=1104 audit(1707812904.050:975): pid=10172 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.050000 audit[10172]: CRED_DISP pid=10172 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:24.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.67.79:22-139.178.68.195:37938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:25.195116 sshd[10169]: Failed password for invalid user craft from 160.251.212.122 port 42026 ssh2 Feb 13 08:28:25.337200 sshd[10169]: Connection closed by invalid user craft 160.251.212.122 port 42026 [preauth] Feb 13 08:28:25.339765 systemd[1]: sshd@93-145.40.67.79:22-160.251.212.122:42026.service: Deactivated successfully. Feb 13 08:28:25.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.67.79:22-160.251.212.122:42026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:25.463590 systemd[1]: Started sshd@95-145.40.67.79:22-160.251.212.122:58056.service. Feb 13 08:28:25.462000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.67.79:22-160.251.212.122:58056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:26.099155 sshd[10276]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=160.251.212.122 user=root Feb 13 08:28:26.098000 audit[10276]: USER_AUTH pid=10276 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=160.251.212.122 addr=160.251.212.122 terminal=ssh res=failed' Feb 13 08:28:27.678213 sshd[10276]: Failed password for root from 160.251.212.122 port 58056 ssh2 Feb 13 08:28:28.142770 sshd[10276]: Connection closed by authenticating user root 160.251.212.122 port 58056 [preauth] Feb 13 08:28:28.145272 systemd[1]: sshd@95-145.40.67.79:22-160.251.212.122:58056.service: Deactivated successfully. Feb 13 08:28:28.144000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.67.79:22-160.251.212.122:58056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:29.058060 systemd[1]: Started sshd@96-145.40.67.79:22-139.178.68.195:40394.service. Feb 13 08:28:29.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.67.79:22-139.178.68.195:40394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:29.084894 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:28:29.084950 kernel: audit: type=1130 audit(1707812909.057:981): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.67.79:22-139.178.68.195:40394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:29.192000 audit[10281]: USER_ACCT pid=10281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.193975 sshd[10281]: Accepted publickey for core from 139.178.68.195 port 40394 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:29.196857 sshd[10281]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:29.201754 systemd-logind[1548]: New session 74 of user core. Feb 13 08:28:29.202170 systemd[1]: Started session-74.scope. Feb 13 08:28:29.280184 sshd[10281]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:29.281630 systemd[1]: sshd@96-145.40.67.79:22-139.178.68.195:40394.service: Deactivated successfully. Feb 13 08:28:29.282324 systemd[1]: session-74.scope: Deactivated successfully. Feb 13 08:28:29.282375 systemd-logind[1548]: Session 74 logged out. Waiting for processes to exit. Feb 13 08:28:29.282736 systemd-logind[1548]: Removed session 74. Feb 13 08:28:29.284996 kernel: audit: type=1101 audit(1707812909.192:982): pid=10281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.285022 kernel: audit: type=1103 audit(1707812909.195:983): pid=10281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.195000 audit[10281]: CRED_ACQ pid=10281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.433958 kernel: audit: type=1006 audit(1707812909.195:984): pid=10281 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=74 res=1 Feb 13 08:28:29.434010 kernel: audit: type=1300 audit(1707812909.195:984): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe172e6f70 a2=3 a3=0 items=0 ppid=1 pid=10281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:29.195000 audit[10281]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe172e6f70 a2=3 a3=0 items=0 ppid=1 pid=10281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:29.525758 kernel: audit: type=1327 audit(1707812909.195:984): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:29.195000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:29.556076 
kernel: audit: type=1105 audit(1707812909.203:985): pid=10281 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.203000 audit[10281]: USER_START pid=10281 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.650425 kernel: audit: type=1103 audit(1707812909.203:986): pid=10284 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.203000 audit[10284]: CRED_ACQ pid=10284 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.739469 kernel: audit: type=1106 audit(1707812909.279:987): pid=10281 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.279000 audit[10281]: USER_END pid=10281 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.834874 kernel: audit: type=1104 audit(1707812909.279:988): pid=10281 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.279000 audit[10281]: CRED_DISP pid=10281 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:29.280000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.67.79:22-139.178.68.195:40394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:34.287313 systemd[1]: Started sshd@97-145.40.67.79:22-139.178.68.195:40408.service. Feb 13 08:28:34.286000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.67.79:22-139.178.68.195:40408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:34.314479 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:28:34.314552 kernel: audit: type=1130 audit(1707812914.286:990): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.67.79:22-139.178.68.195:40408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Feb 13 08:28:34.424000 audit[10332]: USER_ACCT pid=10332 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.425400 sshd[10332]: Accepted publickey for core from 139.178.68.195 port 40408 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:34.426664 sshd[10332]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:34.429017 systemd-logind[1548]: New session 75 of user core. Feb 13 08:28:34.429460 systemd[1]: Started session-75.scope. Feb 13 08:28:34.506475 sshd[10332]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:34.507734 systemd[1]: sshd@97-145.40.67.79:22-139.178.68.195:40408.service: Deactivated successfully. Feb 13 08:28:34.508404 systemd[1]: session-75.scope: Deactivated successfully. Feb 13 08:28:34.508414 systemd-logind[1548]: Session 75 logged out. Waiting for processes to exit. Feb 13 08:28:34.508890 systemd-logind[1548]: Removed session 75. Feb 13 08:28:34.425000 audit[10332]: CRED_ACQ pid=10332 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.606953 kernel: audit: type=1101 audit(1707812914.424:991): pid=10332 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.607004 kernel: audit: type=1103 audit(1707812914.425:992): pid=10332 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.607026 kernel: audit: type=1006 audit(1707812914.425:993): pid=10332 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=75 res=1 Feb 13 08:28:34.665437 kernel: audit: type=1300 audit(1707812914.425:993): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8d9ee080 a2=3 a3=0 items=0 ppid=1 pid=10332 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:34.425000 audit[10332]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8d9ee080 a2=3 a3=0 items=0 ppid=1 pid=10332 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:34.757339 kernel: audit: type=1327 audit(1707812914.425:993): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:34.425000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:34.787678 kernel: audit: type=1105 audit(1707812914.430:994): pid=10332 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:28:34.430000 audit[10332]: USER_START pid=10332 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.881997 kernel: audit: type=1103 audit(1707812914.430:995): pid=10335 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.430000 audit[10335]: CRED_ACQ pid=10335 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.971015 kernel: audit: type=1106 audit(1707812914.506:996): pid=10332 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.506000 audit[10332]: USER_END pid=10332 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:35.066359 kernel: audit: type=1104 audit(1707812914.506:997): pid=10332 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.506000 audit[10332]: CRED_DISP pid=10332 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:34.506000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.67.79:22-139.178.68.195:40408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.513217 systemd[1]: Started sshd@98-145.40.67.79:22-139.178.68.195:60428.service. Feb 13 08:28:39.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.67.79:22-139.178.68.195:60428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.540198 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:28:39.540313 kernel: audit: type=1130 audit(1707812919.512:999): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.67.79:22-139.178.68.195:60428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:39.649000 audit[10360]: USER_ACCT pid=10360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.650238 sshd[10360]: Accepted publickey for core from 139.178.68.195 port 60428 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:39.651333 sshd[10360]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:39.653865 systemd-logind[1548]: New session 76 of user core. Feb 13 08:28:39.654277 systemd[1]: Started session-76.scope. Feb 13 08:28:39.731745 sshd[10360]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:39.733213 systemd[1]: sshd@98-145.40.67.79:22-139.178.68.195:60428.service: Deactivated successfully. Feb 13 08:28:39.733809 systemd-logind[1548]: Session 76 logged out. Waiting for processes to exit. Feb 13 08:28:39.733828 systemd[1]: session-76.scope: Deactivated successfully. Feb 13 08:28:39.734390 systemd-logind[1548]: Removed session 76. Feb 13 08:28:39.650000 audit[10360]: CRED_ACQ pid=10360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.831999 kernel: audit: type=1101 audit(1707812919.649:1000): pid=10360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.832041 kernel: audit: type=1103 audit(1707812919.650:1001): pid=10360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.832059 kernel: audit: type=1006 audit(1707812919.650:1002): pid=10360 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=76 res=1 Feb 13 08:28:39.890569 kernel: audit: type=1300 audit(1707812919.650:1002): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6b5f78a0 a2=3 a3=0 items=0 ppid=1 pid=10360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:39.650000 audit[10360]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6b5f78a0 a2=3 a3=0 items=0 ppid=1 pid=10360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:39.982527 kernel: audit: type=1327 audit(1707812919.650:1002): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:39.650000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:40.012953 kernel: audit: type=1105 audit(1707812919.655:1003): pid=10360 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:28:39.655000 audit[10360]: USER_START pid=10360 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:40.107418 kernel: audit: type=1103 audit(1707812919.655:1004): pid=10363 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.655000 audit[10363]: CRED_ACQ pid=10363 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:40.196518 kernel: audit: type=1106 audit(1707812919.731:1005): pid=10360 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.731000 audit[10360]: USER_END pid=10360 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.731000 audit[10360]: CRED_DISP pid=10360 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:40.381200 kernel: audit: type=1104 audit(1707812919.731:1006): pid=10360 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:39.732000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.67.79:22-139.178.68.195:60428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.740772 systemd[1]: Started sshd@99-145.40.67.79:22-139.178.68.195:60442.service. Feb 13 08:28:44.740000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.67.79:22-139.178.68.195:60442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.768244 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:28:44.768337 kernel: audit: type=1130 audit(1707812924.740:1008): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.67.79:22-139.178.68.195:60442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:44.876000 audit[10389]: USER_ACCT pid=10389 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:44.877226 sshd[10389]: Accepted publickey for core from 139.178.68.195 port 60442 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:44.879290 sshd[10389]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:44.881717 systemd-logind[1548]: New session 77 of user core. Feb 13 08:28:44.882142 systemd[1]: Started session-77.scope. Feb 13 08:28:44.962632 sshd[10389]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:44.963953 systemd[1]: sshd@99-145.40.67.79:22-139.178.68.195:60442.service: Deactivated successfully. Feb 13 08:28:44.964593 systemd[1]: session-77.scope: Deactivated successfully. Feb 13 08:28:44.964615 systemd-logind[1548]: Session 77 logged out. Waiting for processes to exit. Feb 13 08:28:44.965022 systemd-logind[1548]: Removed session 77. Feb 13 08:28:44.878000 audit[10389]: CRED_ACQ pid=10389 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:45.058968 kernel: audit: type=1101 audit(1707812924.876:1009): pid=10389 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:45.059009 kernel: audit: type=1103 audit(1707812924.878:1010): pid=10389 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:45.059029 kernel: audit: type=1006 audit(1707812924.878:1011): pid=10389 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=77 res=1 Feb 13 08:28:45.117624 kernel: audit: type=1300 audit(1707812924.878:1011): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe43f25e70 a2=3 a3=0 items=0 ppid=1 pid=10389 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:44.878000 audit[10389]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe43f25e70 a2=3 a3=0 items=0 ppid=1 pid=10389 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:45.209508 kernel: audit: type=1327 audit(1707812924.878:1011): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:44.878000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:45.239956 kernel: audit: type=1105 audit(1707812924.883:1012): pid=10389 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:28:44.883000 audit[10389]: USER_START pid=10389 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:45.334362 kernel: audit: type=1103 audit(1707812924.883:1013): pid=10392 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:44.883000 audit[10392]: CRED_ACQ pid=10392 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:45.423451 kernel: audit: type=1106 audit(1707812924.962:1014): pid=10389 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:44.962000 audit[10389]: USER_END pid=10389 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:45.518878 kernel: audit: type=1104 audit(1707812924.962:1015): pid=10389 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:44.962000 audit[10389]: CRED_DISP pid=10389 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:44.963000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.67.79:22-139.178.68.195:60442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:49.969110 systemd[1]: Started sshd@100-145.40.67.79:22-139.178.68.195:43430.service. Feb 13 08:28:49.968000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.67.79:22-139.178.68.195:43430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:49.996053 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:28:49.996107 kernel: audit: type=1130 audit(1707812929.968:1017): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.67.79:22-139.178.68.195:43430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:50.106000 audit[10415]: USER_ACCT pid=10415 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.106914 sshd[10415]: Accepted publickey for core from 139.178.68.195 port 43430 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:50.109300 sshd[10415]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:50.111715 systemd-logind[1548]: New session 78 of user core. Feb 13 08:28:50.112122 systemd[1]: Started session-78.scope. Feb 13 08:28:50.188816 sshd[10415]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:50.190254 systemd[1]: sshd@100-145.40.67.79:22-139.178.68.195:43430.service: Deactivated successfully. Feb 13 08:28:50.190862 systemd-logind[1548]: Session 78 logged out. Waiting for processes to exit. Feb 13 08:28:50.190871 systemd[1]: session-78.scope: Deactivated successfully. Feb 13 08:28:50.191398 systemd-logind[1548]: Removed session 78. Feb 13 08:28:50.108000 audit[10415]: CRED_ACQ pid=10415 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.288804 kernel: audit: type=1101 audit(1707812930.106:1018): pid=10415 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.288863 kernel: audit: type=1103 audit(1707812930.108:1019): pid=10415 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.288880 kernel: audit: type=1006 audit(1707812930.108:1020): pid=10415 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=78 res=1 Feb 13 08:28:50.347377 kernel: audit: type=1300 audit(1707812930.108:1020): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd71b6aa80 a2=3 a3=0 items=0 ppid=1 pid=10415 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:50.108000 audit[10415]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd71b6aa80 a2=3 a3=0 items=0 ppid=1 pid=10415 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:50.439374 kernel: audit: type=1327 audit(1707812930.108:1020): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:50.108000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:50.113000 audit[10415]: USER_START pid=10415 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.564221 kernel: audit: 
type=1105 audit(1707812930.113:1021): pid=10415 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.564256 kernel: audit: type=1103 audit(1707812930.113:1022): pid=10418 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.113000 audit[10418]: CRED_ACQ pid=10418 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.653337 kernel: audit: type=1106 audit(1707812930.188:1023): pid=10415 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.188000 audit[10415]: USER_END pid=10415 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.188000 audit[10415]: CRED_DISP pid=10415 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.837976 kernel: audit: type=1104 audit(1707812930.188:1024): pid=10415 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:50.189000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.67.79:22-139.178.68.195:43430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.193767 systemd[1]: Started sshd@101-145.40.67.79:22-139.178.68.195:43436.service. Feb 13 08:28:55.193000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.67.79:22-139.178.68.195:43436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.234334 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:28:55.234457 kernel: audit: type=1130 audit(1707812935.193:1026): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.67.79:22-139.178.68.195:43436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:55.342000 audit[10484]: USER_ACCT pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.342652 sshd[10484]: Accepted publickey for core from 139.178.68.195 port 43436 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:28:55.344982 sshd[10484]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:28:55.347412 systemd-logind[1548]: New session 79 of user core. Feb 13 08:28:55.347817 systemd[1]: Started session-79.scope. Feb 13 08:28:55.424371 sshd[10484]: pam_unix(sshd:session): session closed for user core Feb 13 08:28:55.425751 systemd[1]: sshd@101-145.40.67.79:22-139.178.68.195:43436.service: Deactivated successfully. Feb 13 08:28:55.426414 systemd[1]: session-79.scope: Deactivated successfully. Feb 13 08:28:55.426455 systemd-logind[1548]: Session 79 logged out. Waiting for processes to exit. Feb 13 08:28:55.426900 systemd-logind[1548]: Removed session 79. Feb 13 08:28:55.343000 audit[10484]: CRED_ACQ pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.524306 kernel: audit: type=1101 audit(1707812935.342:1027): pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.524343 kernel: audit: type=1103 audit(1707812935.343:1028): pid=10484 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.524359 kernel: audit: type=1006 audit(1707812935.343:1029): pid=10484 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=79 res=1 Feb 13 08:28:55.582816 kernel: audit: type=1300 audit(1707812935.343:1029): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff1820f30 a2=3 a3=0 items=0 ppid=1 pid=10484 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:55.343000 audit[10484]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff1820f30 a2=3 a3=0 items=0 ppid=1 pid=10484 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:55.343000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:55.705130 kernel: audit: type=1327 audit(1707812935.343:1029): proctitle=737368643A20636F7265205B707269765D Feb 13 08:28:55.705161 kernel: audit: type=1105 audit(1707812935.348:1030): pid=10484 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:28:55.348000 audit[10484]: USER_START pid=10484 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.799519 kernel: audit: type=1103 audit(1707812935.349:1031): pid=10487 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.349000 audit[10487]: CRED_ACQ pid=10487 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.888562 kernel: audit: type=1106 audit(1707812935.424:1032): pid=10484 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.424000 audit[10484]: USER_END pid=10484 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.984005 kernel: audit: type=1104 audit(1707812935.424:1033): pid=10484 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.424000 audit[10484]: CRED_DISP pid=10484 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:28:55.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.67.79:22-139.178.68.195:43436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:00.430556 systemd[1]: Started sshd@102-145.40.67.79:22-139.178.68.195:50296.service. Feb 13 08:29:00.429000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.67.79:22-139.178.68.195:50296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:00.457625 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:00.457660 kernel: audit: type=1130 audit(1707812940.429:1035): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.67.79:22-139.178.68.195:50296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:00.567000 audit[10510]: USER_ACCT pid=10510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.567932 sshd[10510]: Accepted publickey for core from 139.178.68.195 port 50296 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:00.569208 sshd[10510]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:00.571515 systemd-logind[1548]: New session 80 of user core. Feb 13 08:29:00.572067 systemd[1]: Started session-80.scope. Feb 13 08:29:00.648689 sshd[10510]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:00.650068 systemd[1]: sshd@102-145.40.67.79:22-139.178.68.195:50296.service: Deactivated successfully. Feb 13 08:29:00.650729 systemd[1]: session-80.scope: Deactivated successfully. Feb 13 08:29:00.650738 systemd-logind[1548]: Session 80 logged out. Waiting for processes to exit. Feb 13 08:29:00.651365 systemd-logind[1548]: Removed session 80. Feb 13 08:29:00.568000 audit[10510]: CRED_ACQ pid=10510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.749722 kernel: audit: type=1101 audit(1707812940.567:1036): pid=10510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.749757 kernel: audit: type=1103 audit(1707812940.568:1037): pid=10510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.749774 kernel: audit: type=1006 audit(1707812940.568:1038): pid=10510 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=80 res=1 Feb 13 08:29:00.808248 kernel: audit: type=1300 audit(1707812940.568:1038): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd55d3c240 a2=3 a3=0 items=0 ppid=1 pid=10510 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:00.568000 audit[10510]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd55d3c240 a2=3 a3=0 items=0 ppid=1 pid=10510 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:00.900079 kernel: audit: type=1327 audit(1707812940.568:1038): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:00.568000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:00.930490 kernel: audit: type=1105 audit(1707812940.573:1039): pid=10510 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:00.573000 audit[10510]: USER_START pid=10510 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:01.024832 kernel: audit: type=1103 audit(1707812940.573:1040): pid=10513 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.573000 audit[10513]: CRED_ACQ pid=10513 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:01.114007 kernel: audit: type=1106 audit(1707812940.648:1041): pid=10510 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.648000 audit[10510]: USER_END pid=10510 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:01.209501 kernel: audit: type=1104 audit(1707812940.648:1042): pid=10510 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.648000 audit[10510]: CRED_DISP pid=10510 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:00.649000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.67.79:22-139.178.68.195:50296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:05.655488 systemd[1]: Started sshd@103-145.40.67.79:22-139.178.68.195:50304.service. Feb 13 08:29:05.654000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.67.79:22-139.178.68.195:50304 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:05.682437 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:05.682550 kernel: audit: type=1130 audit(1707812945.654:1044): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.67.79:22-139.178.68.195:50304 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:05.792000 audit[10561]: USER_ACCT pid=10561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.792889 sshd[10561]: Accepted publickey for core from 139.178.68.195 port 50304 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:05.793956 sshd[10561]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:05.796205 systemd-logind[1548]: New session 81 of user core. Feb 13 08:29:05.796681 systemd[1]: Started session-81.scope. Feb 13 08:29:05.873950 sshd[10561]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:05.875429 systemd[1]: sshd@103-145.40.67.79:22-139.178.68.195:50304.service: Deactivated successfully. Feb 13 08:29:05.875987 systemd-logind[1548]: Session 81 logged out. Waiting for processes to exit. Feb 13 08:29:05.876038 systemd[1]: session-81.scope: Deactivated successfully. Feb 13 08:29:05.876481 systemd-logind[1548]: Removed session 81. Feb 13 08:29:05.792000 audit[10561]: CRED_ACQ pid=10561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.974647 kernel: audit: type=1101 audit(1707812945.792:1045): pid=10561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.974684 kernel: audit: type=1103 audit(1707812945.792:1046): pid=10561 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.974701 kernel: audit: type=1006 audit(1707812945.792:1047): pid=10561 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=81 res=1 Feb 13 08:29:06.033228 kernel: audit: type=1300 audit(1707812945.792:1047): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffac1c4cd0 a2=3 a3=0 items=0 ppid=1 pid=10561 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:05.792000 audit[10561]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffac1c4cd0 a2=3 a3=0 items=0 ppid=1 pid=10561 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:06.125130 kernel: audit: type=1327 audit(1707812945.792:1047): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:05.792000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:06.155552 kernel: audit: type=1105 audit(1707812945.797:1048): pid=10561 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:05.797000 audit[10561]: USER_START pid=10561 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:06.249918 kernel: audit: type=1103 audit(1707812945.798:1049): pid=10564 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.798000 audit[10564]: CRED_ACQ pid=10564 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:06.339052 kernel: audit: type=1106 audit(1707812945.873:1050): pid=10561 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.873000 audit[10561]: USER_END pid=10561 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:06.434402 kernel: audit: type=1104 audit(1707812945.873:1051): pid=10561 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.873000 audit[10561]: CRED_DISP pid=10561 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:05.874000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.67.79:22-139.178.68.195:50304 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:10.875713 systemd[1]: Started sshd@104-145.40.67.79:22-139.178.68.195:55814.service. Feb 13 08:29:10.874000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.67.79:22-139.178.68.195:55814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:10.902047 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:10.902090 kernel: audit: type=1130 audit(1707812950.874:1053): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.67.79:22-139.178.68.195:55814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:11.011000 audit[10587]: USER_ACCT pid=10587 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.012482 sshd[10587]: Accepted publickey for core from 139.178.68.195 port 55814 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:11.014291 sshd[10587]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:11.016650 systemd-logind[1548]: New session 82 of user core. Feb 13 08:29:11.017070 systemd[1]: Started session-82.scope. Feb 13 08:29:11.094811 sshd[10587]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:11.096152 systemd[1]: sshd@104-145.40.67.79:22-139.178.68.195:55814.service: Deactivated successfully. Feb 13 08:29:11.096794 systemd[1]: session-82.scope: Deactivated successfully. Feb 13 08:29:11.096815 systemd-logind[1548]: Session 82 logged out. Waiting for processes to exit. Feb 13 08:29:11.097327 systemd-logind[1548]: Removed session 82. Feb 13 08:29:11.013000 audit[10587]: CRED_ACQ pid=10587 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.194298 kernel: audit: type=1101 audit(1707812951.011:1054): pid=10587 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.194334 kernel: audit: type=1103 audit(1707812951.013:1055): pid=10587 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.194352 kernel: audit: type=1006 audit(1707812951.013:1056): pid=10587 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=82 res=1 Feb 13 08:29:11.252896 kernel: audit: type=1300 audit(1707812951.013:1056): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe7e7111d0 a2=3 a3=0 items=0 ppid=1 pid=10587 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:11.013000 audit[10587]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe7e7111d0 a2=3 a3=0 items=0 ppid=1 pid=10587 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:11.344883 kernel: audit: type=1327 audit(1707812951.013:1056): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:11.013000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:11.375346 kernel: audit: type=1105 audit(1707812951.018:1057): pid=10587 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:11.018000 audit[10587]: USER_START pid=10587 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.469806 kernel: audit: type=1103 audit(1707812951.018:1058): pid=10590 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.018000 audit[10590]: CRED_ACQ pid=10590 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.558961 kernel: audit: type=1106 audit(1707812951.094:1059): pid=10587 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.094000 audit[10587]: USER_END pid=10587 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.654401 kernel: audit: type=1104 audit(1707812951.094:1060): pid=10587 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.094000 audit[10587]: CRED_DISP pid=10587 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:11.095000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.67.79:22-139.178.68.195:55814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:16.102119 systemd[1]: Started sshd@105-145.40.67.79:22-139.178.68.195:59132.service. Feb 13 08:29:16.101000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.67.79:22-139.178.68.195:59132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:16.129116 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:16.129200 kernel: audit: type=1130 audit(1707812956.101:1062): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.67.79:22-139.178.68.195:59132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:16.238000 audit[10614]: USER_ACCT pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.239874 sshd[10614]: Accepted publickey for core from 139.178.68.195 port 59132 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:16.242291 sshd[10614]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:16.244697 systemd-logind[1548]: New session 83 of user core. Feb 13 08:29:16.245130 systemd[1]: Started session-83.scope. Feb 13 08:29:16.322176 sshd[10614]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:16.323612 systemd[1]: sshd@105-145.40.67.79:22-139.178.68.195:59132.service: Deactivated successfully. Feb 13 08:29:16.324252 systemd[1]: session-83.scope: Deactivated successfully. Feb 13 08:29:16.324306 systemd-logind[1548]: Session 83 logged out. Waiting for processes to exit. Feb 13 08:29:16.324840 systemd-logind[1548]: Removed session 83. Feb 13 08:29:16.241000 audit[10614]: CRED_ACQ pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.421543 kernel: audit: type=1101 audit(1707812956.238:1063): pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.421581 kernel: audit: type=1103 audit(1707812956.241:1064): pid=10614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.421598 kernel: audit: type=1006 audit(1707812956.241:1065): pid=10614 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=83 res=1 Feb 13 08:29:16.480188 kernel: audit: type=1300 audit(1707812956.241:1065): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff80ae0d20 a2=3 a3=0 items=0 ppid=1 pid=10614 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:16.241000 audit[10614]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff80ae0d20 a2=3 a3=0 items=0 ppid=1 pid=10614 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:16.241000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:16.602474 kernel: audit: type=1327 audit(1707812956.241:1065): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:16.602504 kernel: audit: type=1105 audit(1707812956.246:1066): pid=10614 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:16.246000 audit[10614]: USER_START pid=10614 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.696816 kernel: audit: type=1103 audit(1707812956.246:1067): pid=10617 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.246000 audit[10617]: CRED_ACQ pid=10617 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.785935 kernel: audit: type=1106 audit(1707812956.321:1068): pid=10614 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.321000 audit[10614]: USER_END pid=10614 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.881345 kernel: audit: type=1104 audit(1707812956.321:1069): pid=10614 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.321000 audit[10614]: CRED_DISP pid=10614 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:16.322000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.67.79:22-139.178.68.195:59132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:21.328595 systemd[1]: Started sshd@106-145.40.67.79:22-139.178.68.195:59146.service. Feb 13 08:29:21.327000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.67.79:22-139.178.68.195:59146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:21.355349 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:21.355398 kernel: audit: type=1130 audit(1707812961.327:1071): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.67.79:22-139.178.68.195:59146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:21.463000 audit[10640]: USER_ACCT pid=10640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.464357 sshd[10640]: Accepted publickey for core from 139.178.68.195 port 59146 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:21.465298 sshd[10640]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:21.467741 systemd-logind[1548]: New session 84 of user core. Feb 13 08:29:21.468256 systemd[1]: Started session-84.scope. Feb 13 08:29:21.545366 sshd[10640]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:21.546713 systemd[1]: sshd@106-145.40.67.79:22-139.178.68.195:59146.service: Deactivated successfully. Feb 13 08:29:21.547382 systemd[1]: session-84.scope: Deactivated successfully. Feb 13 08:29:21.547423 systemd-logind[1548]: Session 84 logged out. Waiting for processes to exit. Feb 13 08:29:21.547899 systemd-logind[1548]: Removed session 84. Feb 13 08:29:21.464000 audit[10640]: CRED_ACQ pid=10640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.646595 kernel: audit: type=1101 audit(1707812961.463:1072): pid=10640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.646631 kernel: audit: type=1103 audit(1707812961.464:1073): pid=10640 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.646644 kernel: audit: type=1006 audit(1707812961.464:1074): pid=10640 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=84 res=1 Feb 13 08:29:21.705112 kernel: audit: type=1300 audit(1707812961.464:1074): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6f6c22e0 a2=3 a3=0 items=0 ppid=1 pid=10640 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:21.464000 audit[10640]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6f6c22e0 a2=3 a3=0 items=0 ppid=1 pid=10640 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:21.797082 kernel: audit: type=1327 audit(1707812961.464:1074): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:21.464000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:21.827474 kernel: audit: type=1105 audit(1707812961.469:1075): pid=10640 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:21.469000 audit[10640]: USER_START pid=10640 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.921796 kernel: audit: type=1103 audit(1707812961.469:1076): pid=10643 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.469000 audit[10643]: CRED_ACQ pid=10643 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:22.010908 kernel: audit: type=1106 audit(1707812961.545:1077): pid=10640 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.545000 audit[10640]: USER_END pid=10640 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:22.106409 kernel: audit: type=1104 audit(1707812961.545:1078): pid=10640 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.545000 audit[10640]: CRED_DISP pid=10640 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:21.545000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.67.79:22-139.178.68.195:59146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:26.552253 systemd[1]: Started sshd@107-145.40.67.79:22-139.178.68.195:52136.service. Feb 13 08:29:26.551000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.67.79:22-139.178.68.195:52136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:26.579599 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:26.579668 kernel: audit: type=1130 audit(1707812966.551:1080): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.67.79:22-139.178.68.195:52136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:26.688000 audit[10727]: USER_ACCT pid=10727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.690147 sshd[10727]: Accepted publickey for core from 139.178.68.195 port 52136 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:26.693028 sshd[10727]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:26.695604 systemd-logind[1548]: New session 85 of user core. Feb 13 08:29:26.696215 systemd[1]: Started session-85.scope. Feb 13 08:29:26.774600 sshd[10727]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:26.775843 systemd[1]: sshd@107-145.40.67.79:22-139.178.68.195:52136.service: Deactivated successfully. Feb 13 08:29:26.776517 systemd[1]: session-85.scope: Deactivated successfully. Feb 13 08:29:26.776531 systemd-logind[1548]: Session 85 logged out. Waiting for processes to exit. Feb 13 08:29:26.776961 systemd-logind[1548]: Removed session 85. Feb 13 08:29:26.691000 audit[10727]: CRED_ACQ pid=10727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.871869 kernel: audit: type=1101 audit(1707812966.688:1081): pid=10727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.871907 kernel: audit: type=1103 audit(1707812966.691:1082): pid=10727 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.871925 kernel: audit: type=1006 audit(1707812966.691:1083): pid=10727 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=85 res=1 Feb 13 08:29:26.930472 kernel: audit: type=1300 audit(1707812966.691:1083): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe49390f30 a2=3 a3=0 items=0 ppid=1 pid=10727 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:26.691000 audit[10727]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe49390f30 a2=3 a3=0 items=0 ppid=1 pid=10727 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:27.022387 kernel: audit: type=1327 audit(1707812966.691:1083): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:26.691000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:27.052846 kernel: audit: type=1105 audit(1707812966.697:1084): pid=10727 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:26.697000 audit[10727]: USER_START pid=10727 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:27.147215 kernel: audit: type=1103 audit(1707812966.698:1085): pid=10730 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.698000 audit[10730]: CRED_ACQ pid=10730 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:27.236317 kernel: audit: type=1106 audit(1707812966.774:1086): pid=10727 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.774000 audit[10727]: USER_END pid=10727 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:27.331750 kernel: audit: type=1104 audit(1707812966.774:1087): pid=10727 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.774000 audit[10727]: CRED_DISP pid=10727 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:26.775000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.67.79:22-139.178.68.195:52136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:31.781112 systemd[1]: Started sshd@108-145.40.67.79:22-139.178.68.195:52138.service. Feb 13 08:29:31.780000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.67.79:22-139.178.68.195:52138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:31.808055 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:31.808129 kernel: audit: type=1130 audit(1707812971.780:1089): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.67.79:22-139.178.68.195:52138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:31.917000 audit[10753]: USER_ACCT pid=10753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:31.918511 sshd[10753]: Accepted publickey for core from 139.178.68.195 port 52138 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:31.920313 sshd[10753]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:31.922735 systemd-logind[1548]: New session 86 of user core. Feb 13 08:29:31.923277 systemd[1]: Started session-86.scope. Feb 13 08:29:32.005402 sshd[10753]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:32.006928 systemd[1]: sshd@108-145.40.67.79:22-139.178.68.195:52138.service: Deactivated successfully. Feb 13 08:29:32.007778 systemd-logind[1548]: Session 86 logged out. Waiting for processes to exit. Feb 13 08:29:32.007803 systemd[1]: session-86.scope: Deactivated successfully. Feb 13 08:29:32.008458 systemd-logind[1548]: Removed session 86. Feb 13 08:29:31.919000 audit[10753]: CRED_ACQ pid=10753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.102598 kernel: audit: type=1101 audit(1707812971.917:1090): pid=10753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.102643 kernel: audit: type=1103 audit(1707812971.919:1091): pid=10753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.102660 kernel: audit: type=1006 audit(1707812971.919:1092): pid=10753 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=86 res=1 Feb 13 08:29:31.919000 audit[10753]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc4d19cbb0 a2=3 a3=0 items=0 ppid=1 pid=10753 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:32.253081 kernel: audit: type=1300 audit(1707812971.919:1092): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc4d19cbb0 a2=3 a3=0 items=0 ppid=1 pid=10753 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:32.253118 kernel: audit: type=1327 audit(1707812971.919:1092): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:31.919000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:32.283520 kernel: audit: type=1105 audit(1707812971.925:1093): pid=10753 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 
08:29:31.925000 audit[10753]: USER_START pid=10753 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:31.925000 audit[10756]: CRED_ACQ pid=10756 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.467085 kernel: audit: type=1103 audit(1707812971.925:1094): pid=10756 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.467117 kernel: audit: type=1106 audit(1707812972.005:1095): pid=10753 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.005000 audit[10753]: USER_END pid=10753 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.562506 kernel: audit: type=1104 audit(1707812972.005:1096): pid=10753 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.005000 audit[10753]: CRED_DISP pid=10753 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:32.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.67.79:22-139.178.68.195:52138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:37.012149 systemd[1]: Started sshd@109-145.40.67.79:22-139.178.68.195:39604.service. Feb 13 08:29:37.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.67.79:22-139.178.68.195:39604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:37.039196 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:37.039271 kernel: audit: type=1130 audit(1707812977.011:1098): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.67.79:22-139.178.68.195:39604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:37.148000 audit[10808]: USER_ACCT pid=10808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.149509 sshd[10808]: Accepted publickey for core from 139.178.68.195 port 39604 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:37.152292 sshd[10808]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:37.154711 systemd-logind[1548]: New session 87 of user core. Feb 13 08:29:37.155161 systemd[1]: Started session-87.scope. Feb 13 08:29:37.151000 audit[10808]: CRED_ACQ pid=10808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.331238 kernel: audit: type=1101 audit(1707812977.148:1099): pid=10808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.331277 kernel: audit: type=1103 audit(1707812977.151:1100): pid=10808 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.331295 kernel: audit: type=1006 audit(1707812977.151:1101): pid=10808 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=87 res=1 Feb 13 08:29:37.390017 kernel: audit: type=1300 audit(1707812977.151:1101): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe22466df0 a2=3 a3=0 items=0 ppid=1 pid=10808 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:37.151000 audit[10808]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe22466df0 a2=3 a3=0 items=0 ppid=1 pid=10808 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:37.481766 kernel: audit: type=1327 audit(1707812977.151:1101): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:37.151000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:37.482000 sshd[10808]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:37.483529 systemd[1]: sshd@109-145.40.67.79:22-139.178.68.195:39604.service: Deactivated successfully. Feb 13 08:29:37.484156 systemd-logind[1548]: Session 87 logged out. Waiting for processes to exit. Feb 13 08:29:37.484185 systemd[1]: session-87.scope: Deactivated successfully. Feb 13 08:29:37.484775 systemd-logind[1548]: Removed session 87. 
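Sessions 79 through 87 above all follow the same cycle: sshd accepts a publickey login for core from 139.178.68.195, PAM opens the session, systemd starts a session-NN.scope, and the session is closed again a fraction of a second later. A minimal sketch of pulling those accept/close pairs out of a saved copy of this journal (the file name journal.log is hypothetical, and the regular expressions only assume the sshd/pam_unix wording visible in the entries above):

```python
import re

# Hypothetical file holding the journal text shown in this capture.
LOG_PATH = "journal.log"

# e.g. "sshd[10484]: Accepted publickey for core from 139.178.68.195 port 43436 ssh2: ..."
ACCEPT_RE = re.compile(r"sshd\[(\d+)\]: Accepted publickey for (\S+) from (\S+) port (\d+)")
# e.g. "sshd[10484]: pam_unix(sshd:session): session closed for user core"
CLOSE_RE = re.compile(r"sshd\[(\d+)\]: pam_unix\(sshd:session\): session closed for user (\S+)")

with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    text = fh.read()

# Map sshd pid -> (user, client address, client port) for every accepted login.
accepted = {}
for match in ACCEPT_RE.finditer(text):
    pid, user, addr, port = match.groups()
    accepted[pid] = (user, addr, port)

# Report each accepted login and whether a matching session close was logged.
closed_pids = {match.group(1) for match in CLOSE_RE.finditer(text)}
for pid, (user, addr, port) in accepted.items():
    state = "opened and closed" if pid in closed_pids else "still open at end of excerpt"
    print(f"sshd pid {pid}: {user}@{addr}:{port} {state}")
```

Matching against the whole text rather than line by line keeps the sketch tolerant of entries that wrap across physical lines, as they do in this capture.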
Feb 13 08:29:37.512241 kernel: audit: type=1105 audit(1707812977.157:1102): pid=10808 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.157000 audit[10808]: USER_START pid=10808 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.606625 kernel: audit: type=1103 audit(1707812977.158:1103): pid=10811 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.158000 audit[10811]: CRED_ACQ pid=10811 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.695742 kernel: audit: type=1106 audit(1707812977.481:1104): pid=10808 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.481000 audit[10808]: USER_END pid=10808 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.791136 kernel: audit: type=1104 audit(1707812977.481:1105): pid=10808 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.481000 audit[10808]: CRED_DISP pid=10808 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:37.482000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.67.79:22-139.178.68.195:39604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:42.431058 systemd[1]: Started sshd@110-145.40.67.79:22-139.178.68.195:39620.service. Feb 13 08:29:42.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.67.79:22-139.178.68.195:39620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:42.458593 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:42.458625 kernel: audit: type=1130 audit(1707812982.430:1107): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.67.79:22-139.178.68.195:39620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:42.567000 audit[10836]: USER_ACCT pid=10836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.568894 sshd[10836]: Accepted publickey for core from 139.178.68.195 port 39620 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:42.570071 sshd[10836]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:42.572431 systemd-logind[1548]: New session 88 of user core. Feb 13 08:29:42.572919 systemd[1]: Started session-88.scope. Feb 13 08:29:42.650832 sshd[10836]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:42.652287 systemd[1]: sshd@110-145.40.67.79:22-139.178.68.195:39620.service: Deactivated successfully. Feb 13 08:29:42.652927 systemd-logind[1548]: Session 88 logged out. Waiting for processes to exit. Feb 13 08:29:42.652934 systemd[1]: session-88.scope: Deactivated successfully. Feb 13 08:29:42.653455 systemd-logind[1548]: Removed session 88. Feb 13 08:29:42.568000 audit[10836]: CRED_ACQ pid=10836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.750938 kernel: audit: type=1101 audit(1707812982.567:1108): pid=10836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.750974 kernel: audit: type=1103 audit(1707812982.568:1109): pid=10836 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.750998 kernel: audit: type=1006 audit(1707812982.568:1110): pid=10836 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=88 res=1 Feb 13 08:29:42.809483 kernel: audit: type=1300 audit(1707812982.568:1110): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf7e14af0 a2=3 a3=0 items=0 ppid=1 pid=10836 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:42.568000 audit[10836]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcf7e14af0 a2=3 a3=0 items=0 ppid=1 pid=10836 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:42.901331 kernel: audit: type=1327 audit(1707812982.568:1110): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:42.568000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:29:42.931726 kernel: audit: type=1105 audit(1707812982.574:1111): pid=10836 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.574000 audit[10836]: USER_START pid=10836 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:43.026069 kernel: audit: type=1103 audit(1707812982.574:1112): pid=10839 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.574000 audit[10839]: CRED_ACQ pid=10839 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:43.115081 kernel: audit: type=1106 audit(1707812982.650:1113): pid=10836 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.650000 audit[10836]: USER_END pid=10836 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:43.210462 kernel: audit: type=1104 audit(1707812982.650:1114): pid=10836 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.650000 audit[10836]: CRED_DISP pid=10836 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:42.651000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.67.79:22-139.178.68.195:39620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:47.653872 systemd[1]: Started sshd@111-145.40.67.79:22-139.178.68.195:38814.service. Feb 13 08:29:47.652000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.67.79:22-139.178.68.195:38814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:47.680927 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:47.680972 kernel: audit: type=1130 audit(1707812987.652:1116): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.67.79:22-139.178.68.195:38814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:47.790000 audit[10862]: USER_ACCT pid=10862 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.791467 sshd[10862]: Accepted publickey for core from 139.178.68.195 port 38814 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:47.792008 sshd[10862]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:47.794260 systemd-logind[1548]: New session 89 of user core. Feb 13 08:29:47.794746 systemd[1]: Started session-89.scope. Feb 13 08:29:47.871379 sshd[10862]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:47.872701 systemd[1]: sshd@111-145.40.67.79:22-139.178.68.195:38814.service: Deactivated successfully. Feb 13 08:29:47.873298 systemd-logind[1548]: Session 89 logged out. Waiting for processes to exit. Feb 13 08:29:47.873346 systemd[1]: session-89.scope: Deactivated successfully. Feb 13 08:29:47.873819 systemd-logind[1548]: Removed session 89. Feb 13 08:29:47.790000 audit[10862]: CRED_ACQ pid=10862 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.973076 kernel: audit: type=1101 audit(1707812987.790:1117): pid=10862 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.973114 kernel: audit: type=1103 audit(1707812987.790:1118): pid=10862 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.973130 kernel: audit: type=1006 audit(1707812987.790:1119): pid=10862 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=89 res=1 Feb 13 08:29:48.031613 kernel: audit: type=1300 audit(1707812987.790:1119): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd61f574f0 a2=3 a3=0 items=0 ppid=1 pid=10862 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:47.790000 audit[10862]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd61f574f0 a2=3 a3=0 items=0 ppid=1 pid=10862 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:48.123405 kernel: audit: type=1327 audit(1707812987.790:1119): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:47.790000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:29:48.153776 kernel: audit: type=1105 audit(1707812987.795:1120): pid=10862 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.795000 audit[10862]: USER_START pid=10862 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:48.248075 kernel: audit: type=1103 audit(1707812987.796:1121): pid=10865 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.796000 audit[10865]: CRED_ACQ pid=10865 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:48.337066 kernel: audit: type=1106 audit(1707812987.871:1122): pid=10862 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.871000 audit[10862]: USER_END pid=10862 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:48.432366 kernel: audit: type=1104 audit(1707812987.871:1123): pid=10862 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.871000 audit[10862]: CRED_DISP pid=10862 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:47.871000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.67.79:22-139.178.68.195:38814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:52.878248 systemd[1]: Started sshd@112-145.40.67.79:22-139.178.68.195:38828.service. Feb 13 08:29:52.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.67.79:22-139.178.68.195:38828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:52.905034 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:52.905101 kernel: audit: type=1130 audit(1707812992.877:1125): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.67.79:22-139.178.68.195:38828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:53.014000 audit[10888]: USER_ACCT pid=10888 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.015953 sshd[10888]: Accepted publickey for core from 139.178.68.195 port 38828 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:53.017282 sshd[10888]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:53.019656 systemd-logind[1548]: New session 90 of user core. Feb 13 08:29:53.020181 systemd[1]: Started session-90.scope. Feb 13 08:29:53.097051 sshd[10888]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:53.098542 systemd[1]: sshd@112-145.40.67.79:22-139.178.68.195:38828.service: Deactivated successfully. Feb 13 08:29:53.099201 systemd-logind[1548]: Session 90 logged out. Waiting for processes to exit. Feb 13 08:29:53.099213 systemd[1]: session-90.scope: Deactivated successfully. Feb 13 08:29:53.099649 systemd-logind[1548]: Removed session 90. Feb 13 08:29:53.016000 audit[10888]: CRED_ACQ pid=10888 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.197513 kernel: audit: type=1101 audit(1707812993.014:1126): pid=10888 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.197549 kernel: audit: type=1103 audit(1707812993.016:1127): pid=10888 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.197565 kernel: audit: type=1006 audit(1707812993.016:1128): pid=10888 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=90 res=1 Feb 13 08:29:53.255981 kernel: audit: type=1300 audit(1707812993.016:1128): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffda7cb2220 a2=3 a3=0 items=0 ppid=1 pid=10888 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.016000 audit[10888]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffda7cb2220 a2=3 a3=0 items=0 ppid=1 pid=10888 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.347769 kernel: audit: type=1327 audit(1707812993.016:1128): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:53.016000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:29:53.378169 kernel: audit: type=1105 audit(1707812993.021:1129): pid=10888 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.021000 audit[10888]: USER_START pid=10888 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.021000 audit[10891]: CRED_ACQ pid=10891 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.561529 kernel: audit: type=1103 audit(1707812993.021:1130): pid=10891 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.561563 kernel: audit: type=1106 audit(1707812993.096:1131): pid=10888 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.096000 audit[10888]: USER_END pid=10888 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.656816 kernel: audit: type=1104 audit(1707812993.096:1132): pid=10888 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.096000 audit[10888]: CRED_DISP pid=10888 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:53.097000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.67.79:22-139.178.68.195:38828 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:58.103941 systemd[1]: Started sshd@113-145.40.67.79:22-139.178.68.195:60310.service. Feb 13 08:29:58.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.67.79:22-139.178.68.195:60310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:58.130861 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:29:58.130899 kernel: audit: type=1130 audit(1707812998.103:1134): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.67.79:22-139.178.68.195:60310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:58.240000 audit[10969]: USER_ACCT pid=10969 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.241244 sshd[10969]: Accepted publickey for core from 139.178.68.195 port 60310 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:58.243264 sshd[10969]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:58.245526 systemd-logind[1548]: New session 91 of user core. Feb 13 08:29:58.245984 systemd[1]: Started session-91.scope. Feb 13 08:29:58.322883 sshd[10969]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:58.324360 systemd[1]: sshd@113-145.40.67.79:22-139.178.68.195:60310.service: Deactivated successfully. Feb 13 08:29:58.324914 systemd-logind[1548]: Session 91 logged out. Waiting for processes to exit. Feb 13 08:29:58.324947 systemd[1]: session-91.scope: Deactivated successfully. Feb 13 08:29:58.325430 systemd-logind[1548]: Removed session 91. Feb 13 08:29:58.242000 audit[10969]: CRED_ACQ pid=10969 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.424090 kernel: audit: type=1101 audit(1707812998.240:1135): pid=10969 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.424140 kernel: audit: type=1103 audit(1707812998.242:1136): pid=10969 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.424159 kernel: audit: type=1006 audit(1707812998.242:1137): pid=10969 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=91 res=1 Feb 13 08:29:58.482657 kernel: audit: type=1300 audit(1707812998.242:1137): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3acacb30 a2=3 a3=0 items=0 ppid=1 pid=10969 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:58.242000 audit[10969]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3acacb30 a2=3 a3=0 items=0 ppid=1 pid=10969 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:58.574445 kernel: audit: type=1327 audit(1707812998.242:1137): proctitle=737368643A20636F7265205B707269765D Feb 13 08:29:58.242000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:29:58.604839 kernel: audit: type=1105 audit(1707812998.247:1138): pid=10969 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.247000 audit[10969]: USER_START pid=10969 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.699074 kernel: audit: type=1103 audit(1707812998.247:1139): pid=10972 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.247000 audit[10972]: CRED_ACQ pid=10972 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.322000 audit[10969]: USER_END pid=10969 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.883454 kernel: audit: type=1106 audit(1707812998.322:1140): pid=10969 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.883479 kernel: audit: type=1104 audit(1707812998.322:1141): pid=10969 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.322000 audit[10969]: CRED_DISP pid=10969 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:58.323000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.67.79:22-139.178.68.195:60310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:30:03.329973 systemd[1]: Started sshd@114-145.40.67.79:22-139.178.68.195:60318.service. Feb 13 08:30:03.329000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.67.79:22-139.178.68.195:60318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:30:03.356984 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:30:03.357042 kernel: audit: type=1130 audit(1707813003.329:1143): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.67.79:22-139.178.68.195:60318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:30:03.466000 audit[11024]: USER_ACCT pid=11024 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.467466 sshd[11024]: Accepted publickey for core from 139.178.68.195 port 60318 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:30:03.469167 sshd[11024]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:30:03.471292 systemd-logind[1548]: New session 92 of user core. Feb 13 08:30:03.471841 systemd[1]: Started session-92.scope. Feb 13 08:30:03.549913 sshd[11024]: pam_unix(sshd:session): session closed for user core Feb 13 08:30:03.551408 systemd[1]: sshd@114-145.40.67.79:22-139.178.68.195:60318.service: Deactivated successfully. Feb 13 08:30:03.551966 systemd-logind[1548]: Session 92 logged out. Waiting for processes to exit. Feb 13 08:30:03.552005 systemd[1]: session-92.scope: Deactivated successfully. Feb 13 08:30:03.552517 systemd-logind[1548]: Removed session 92. Feb 13 08:30:03.468000 audit[11024]: CRED_ACQ pid=11024 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.649381 kernel: audit: type=1101 audit(1707813003.466:1144): pid=11024 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.649428 kernel: audit: type=1103 audit(1707813003.468:1145): pid=11024 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.649448 kernel: audit: type=1006 audit(1707813003.468:1146): pid=11024 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=92 res=1 Feb 13 08:30:03.707907 kernel: audit: type=1300 audit(1707813003.468:1146): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd119e18c0 a2=3 a3=0 items=0 ppid=1 pid=11024 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:30:03.468000 audit[11024]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd119e18c0 a2=3 a3=0 items=0 ppid=1 pid=11024 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:30:03.799758 kernel: audit: type=1327 audit(1707813003.468:1146): proctitle=737368643A20636F7265205B707269765D Feb 13 08:30:03.468000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:30:03.830171 kernel: audit: type=1105 audit(1707813003.473:1147): pid=11024 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.473000 audit[11024]: USER_START pid=11024 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.924512 kernel: audit: type=1103 audit(1707813003.473:1148): pid=11027 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.473000 audit[11027]: CRED_ACQ pid=11027 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:04.013680 kernel: audit: type=1106 audit(1707813003.549:1149): pid=11024 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.549000 audit[11024]: USER_END pid=11024 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:04.109078 kernel: audit: type=1104 audit(1707813003.549:1150): pid=11024 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.549000 audit[11024]: CRED_DISP pid=11024 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:03.550000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.67.79:22-139.178.68.195:60318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:30:08.556134 systemd[1]: Started sshd@115-145.40.67.79:22-139.178.68.195:53910.service. Feb 13 08:30:08.555000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.67.79:22-139.178.68.195:53910 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:30:08.582963 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:30:08.583053 kernel: audit: type=1130 audit(1707813008.555:1152): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.67.79:22-139.178.68.195:53910 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:30:08.692000 audit[11051]: USER_ACCT pid=11051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.693891 sshd[11051]: Accepted publickey for core from 139.178.68.195 port 53910 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:30:08.695332 sshd[11051]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:30:08.697722 systemd-logind[1548]: New session 93 of user core. Feb 13 08:30:08.698291 systemd[1]: Started session-93.scope. Feb 13 08:30:08.694000 audit[11051]: CRED_ACQ pid=11051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.786541 sshd[11051]: pam_unix(sshd:session): session closed for user core Feb 13 08:30:08.787924 systemd[1]: sshd@115-145.40.67.79:22-139.178.68.195:53910.service: Deactivated successfully. Feb 13 08:30:08.788540 systemd-logind[1548]: Session 93 logged out. Waiting for processes to exit. Feb 13 08:30:08.788546 systemd[1]: session-93.scope: Deactivated successfully. Feb 13 08:30:08.788946 systemd-logind[1548]: Removed session 93. 
Feb 13 08:30:08.875650 kernel: audit: type=1101 audit(1707813008.692:1153): pid=11051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.875695 kernel: audit: type=1103 audit(1707813008.694:1154): pid=11051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.875713 kernel: audit: type=1006 audit(1707813008.694:1155): pid=11051 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=93 res=1 Feb 13 08:30:08.694000 audit[11051]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcdcf43a00 a2=3 a3=0 items=0 ppid=1 pid=11051 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:30:09.026153 kernel: audit: type=1300 audit(1707813008.694:1155): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcdcf43a00 a2=3 a3=0 items=0 ppid=1 pid=11051 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:30:09.026184 kernel: audit: type=1327 audit(1707813008.694:1155): proctitle=737368643A20636F7265205B707269765D Feb 13 08:30:08.694000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:30:09.056626 kernel: audit: type=1105 audit(1707813008.699:1156): pid=11051 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.699000 audit[11051]: USER_START pid=11051 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:09.151014 kernel: audit: type=1103 audit(1707813008.700:1157): pid=11054 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.700000 audit[11054]: CRED_ACQ pid=11054 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:09.240146 kernel: audit: type=1106 audit(1707813008.786:1158): pid=11051 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.786000 audit[11051]: USER_END pid=11051 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:09.335638 kernel: audit: type=1104 audit(1707813008.786:1159): pid=11051 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.786000 audit[11051]: CRED_DISP pid=11051 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:30:08.787000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.67.79:22-139.178.68.195:53910 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'