Feb 13 08:28:38.549741 kernel: Linux version 5.15.148-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Feb 12 18:05:31 -00 2024
Feb 13 08:28:38.549753 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4
Feb 13 08:28:38.549760 kernel: BIOS-provided physical RAM map:
Feb 13 08:28:38.549764 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Feb 13 08:28:38.549768 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Feb 13 08:28:38.549771 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Feb 13 08:28:38.549776 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Feb 13 08:28:38.549780 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Feb 13 08:28:38.549784 kernel: BIOS-e820: [mem 0x0000000040400000-0x00000000820dcfff] usable
Feb 13 08:28:38.549787 kernel: BIOS-e820: [mem 0x00000000820dd000-0x00000000820ddfff] ACPI NVS
Feb 13 08:28:38.549792 kernel: BIOS-e820: [mem 0x00000000820de000-0x00000000820defff] reserved
Feb 13 08:28:38.549796 kernel: BIOS-e820: [mem 0x00000000820df000-0x000000008afccfff] usable
Feb 13 08:28:38.549799 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Feb 13 08:28:38.549803 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Feb 13 08:28:38.549808 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Feb 13 08:28:38.549813 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Feb 13 08:28:38.549818 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Feb 13 08:28:38.549822 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Feb 13 08:28:38.549826 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Feb 13 08:28:38.549830 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Feb 13 08:28:38.549834 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Feb 13 08:28:38.549838 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 08:28:38.549842 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Feb 13 08:28:38.549846 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Feb 13 08:28:38.549851 kernel: NX (Execute Disable) protection: active
Feb 13 08:28:38.549855 kernel: SMBIOS 3.2.1 present.
Feb 13 08:28:38.549860 kernel: DMI: Supermicro X11SCM-F/X11SCM-F, BIOS 1.9 09/16/2022
Feb 13 08:28:38.549864 kernel: tsc: Detected 3400.000 MHz processor
Feb 13 08:28:38.549868 kernel: tsc: Detected 3399.906 MHz TSC
Feb 13 08:28:38.549873 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 08:28:38.549877 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 08:28:38.549882 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Feb 13 08:28:38.549886 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 08:28:38.549890 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Feb 13 08:28:38.549895 kernel: Using GB pages for direct mapping
Feb 13 08:28:38.549899 kernel: ACPI: Early table checksum verification disabled
Feb 13 08:28:38.549904 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Feb 13 08:28:38.549908 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Feb 13 08:28:38.549913 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Feb 13 08:28:38.549917 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Feb 13 08:28:38.549923 kernel: ACPI: FACS 0x000000008C66CF80 000040
Feb 13 08:28:38.549928 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Feb 13 08:28:38.549934 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Feb 13 08:28:38.549938 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Feb 13 08:28:38.549943 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Feb 13 08:28:38.549948 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Feb 13 08:28:38.549952 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Feb 13 08:28:38.549957 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Feb 13 08:28:38.549962 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Feb 13 08:28:38.549966 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 08:28:38.549972 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Feb 13 08:28:38.549977 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Feb 13 08:28:38.549981 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 08:28:38.549986 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 08:28:38.549991 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Feb 13 08:28:38.549995 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Feb 13 08:28:38.550000 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Feb 13 08:28:38.550005 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Feb 13 08:28:38.550010 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Feb 13 08:28:38.550015 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Feb 13 08:28:38.550019 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Feb 13 08:28:38.550024 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Feb 13 08:28:38.550029 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Feb 13 08:28:38.550033 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Feb 13 08:28:38.550038 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Feb 13 08:28:38.550043 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Feb 13 08:28:38.550047 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Feb 13 08:28:38.550053 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Feb 13 08:28:38.550057 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Feb 13 08:28:38.550062 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Feb 13 08:28:38.550067 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Feb 13 08:28:38.550071 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Feb 13 08:28:38.550076 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Feb 13 08:28:38.550080 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Feb 13 08:28:38.550085 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Feb 13 08:28:38.550091 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Feb 13 08:28:38.550095 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Feb 13 08:28:38.550100 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Feb 13 08:28:38.550105 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Feb 13 08:28:38.550109 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Feb 13 08:28:38.550114 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Feb 13 08:28:38.550118 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Feb 13 08:28:38.550123 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Feb 13 08:28:38.550128 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Feb 13 08:28:38.550133 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Feb 13 08:28:38.550138 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Feb 13 08:28:38.550142 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Feb 13 08:28:38.550147 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Feb 13 08:28:38.550152 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Feb 13 08:28:38.550156 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Feb 13 08:28:38.550161 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Feb 13 08:28:38.550165 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Feb 13 08:28:38.550170 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Feb 13 08:28:38.550176 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Feb 13 08:28:38.550180 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Feb 13 08:28:38.550185 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Feb 13 08:28:38.550189 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Feb 13 08:28:38.550194 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Feb 13 08:28:38.550199 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Feb 13 08:28:38.550203 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Feb 13 08:28:38.550208 kernel: No NUMA configuration found
Feb 13 08:28:38.550213 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Feb 13 08:28:38.550218 kernel: NODE_DATA(0) allocated [mem 0x86effa000-0x86effffff]
Feb 13 08:28:38.550223 kernel: Zone ranges:
Feb 13 08:28:38.550228 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 08:28:38.550232 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Feb 13 08:28:38.550237 kernel: Normal [mem 0x0000000100000000-0x000000086effffff]
Feb 13 08:28:38.550242 kernel: Movable zone start for each node
Feb 13 08:28:38.550246 kernel: Early memory node ranges
Feb 13 08:28:38.550251 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff]
Feb 13 08:28:38.550256 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Feb 13 08:28:38.550260 kernel: node 0: [mem 0x0000000040400000-0x00000000820dcfff]
Feb 13 08:28:38.550266 kernel: node 0: [mem 0x00000000820df000-0x000000008afccfff]
Feb 13 08:28:38.550270 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff]
Feb 13 08:28:38.550275 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff]
Feb 13 08:28:38.550280 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff]
Feb 13 08:28:38.550284 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Feb 13 08:28:38.550289 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 08:28:38.550297 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Feb 13 08:28:38.550303 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Feb 13 08:28:38.550308 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Feb 13 08:28:38.550313 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Feb 13 08:28:38.550319 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Feb 13 08:28:38.550324 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Feb 13 08:28:38.550329 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Feb 13 08:28:38.550334 kernel: ACPI: PM-Timer IO Port: 0x1808
Feb 13 08:28:38.550339 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 08:28:38.550344 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 08:28:38.550349 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 08:28:38.550355 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 08:28:38.550360 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 08:28:38.550365 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 08:28:38.550370 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 08:28:38.550375 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 08:28:38.550380 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 08:28:38.550384 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 08:28:38.550389 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 08:28:38.550394 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 08:28:38.550400 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 08:28:38.550405 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 08:28:38.550429 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 08:28:38.550434 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 08:28:38.550440 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Feb 13 08:28:38.550458 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 13 08:28:38.550463 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 13 08:28:38.550468 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 08:28:38.550473 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Feb 13 08:28:38.550479 kernel: TSC deadline timer available
Feb 13 08:28:38.550484 kernel: smpboot: Allowing 16 CPUs, 0 hotplug CPUs
Feb 13 08:28:38.550489 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Feb 13 08:28:38.550494 kernel: Booting paravirtualized kernel on bare hardware
Feb 13 08:28:38.550499 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 08:28:38.550504 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:16 nr_node_ids:1
Feb 13 08:28:38.550509 kernel: percpu: Embedded 55 pages/cpu s185624 r8192 d31464 u262144
Feb 13 08:28:38.550514 kernel: pcpu-alloc: s185624 r8192 d31464 u262144 alloc=1*2097152
Feb 13 08:28:38.550519 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Feb 13 08:28:38.550525 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8232415
Feb 13 08:28:38.550530 kernel: Policy zone: Normal
Feb 13 08:28:38.550536 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4
Feb 13 08:28:38.550541 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 08:28:38.550546 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Feb 13 08:28:38.550551 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 13 08:28:38.550556 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 08:28:38.550562 kernel: Memory: 32724720K/33452980K available (12294K kernel code, 2275K rwdata, 13700K rodata, 45496K init, 4048K bss, 728000K reserved, 0K cma-reserved)
Feb 13 08:28:38.550567 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Feb 13 08:28:38.550572 kernel: ftrace: allocating 34475 entries in 135 pages
Feb 13 08:28:38.550577 kernel: ftrace: allocated 135 pages with 4 groups
Feb 13 08:28:38.550582 kernel: rcu: Hierarchical RCU implementation.
Feb 13 08:28:38.550587 kernel: rcu: RCU event tracing is enabled.
Feb 13 08:28:38.550593 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Feb 13 08:28:38.550598 kernel: Rude variant of Tasks RCU enabled.
Feb 13 08:28:38.550603 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 08:28:38.550609 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 08:28:38.550614 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Feb 13 08:28:38.550619 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Feb 13 08:28:38.550624 kernel: random: crng init done
Feb 13 08:28:38.550628 kernel: Console: colour dummy device 80x25
Feb 13 08:28:38.550633 kernel: printk: console [tty0] enabled
Feb 13 08:28:38.550638 kernel: printk: console [ttyS1] enabled
Feb 13 08:28:38.550644 kernel: ACPI: Core revision 20210730
Feb 13 08:28:38.550649 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Feb 13 08:28:38.550654 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 08:28:38.550659 kernel: DMAR: Host address width 39
Feb 13 08:28:38.550664 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Feb 13 08:28:38.550669 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Feb 13 08:28:38.550674 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Feb 13 08:28:38.550679 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Feb 13 08:28:38.550684 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Feb 13 08:28:38.550689 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Feb 13 08:28:38.550694 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Feb 13 08:28:38.550699 kernel: x2apic enabled
Feb 13 08:28:38.550705 kernel: Switched APIC routing to cluster x2apic.
Feb 13 08:28:38.550710 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Feb 13 08:28:38.550715 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Feb 13 08:28:38.550720 kernel: CPU0: Thermal monitoring enabled (TM1)
Feb 13 08:28:38.550725 kernel: process: using mwait in idle threads
Feb 13 08:28:38.550730 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 08:28:38.550735 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 08:28:38.550740 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 08:28:38.550745 kernel: Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
Feb 13 08:28:38.550751 kernel: Spectre V2 : Mitigation: Enhanced IBRS
Feb 13 08:28:38.550756 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 08:28:38.550761 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 08:28:38.550766 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 08:28:38.550771 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 08:28:38.550776 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp
Feb 13 08:28:38.550781 kernel: TAA: Mitigation: TSX disabled
Feb 13 08:28:38.550786 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Feb 13 08:28:38.550791 kernel: SRBDS: Mitigation: Microcode
Feb 13 08:28:38.550796 kernel: GDS: Vulnerable: No microcode
Feb 13 08:28:38.550801 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 08:28:38.550806 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 08:28:38.550811 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 08:28:38.550816 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Feb 13 08:28:38.550821 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Feb 13 08:28:38.550826 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 08:28:38.550831 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Feb 13 08:28:38.550836 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Feb 13 08:28:38.550841 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Feb 13 08:28:38.550846 kernel: Freeing SMP alternatives memory: 32K
Feb 13 08:28:38.550851 kernel: pid_max: default: 32768 minimum: 301
Feb 13 08:28:38.550856 kernel: LSM: Security Framework initializing
Feb 13 08:28:38.550861 kernel: SELinux: Initializing.
Feb 13 08:28:38.550867 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 13 08:28:38.550872 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 13 08:28:38.550877 kernel: smpboot: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Feb 13 08:28:38.550882 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 13 08:28:38.550887 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Feb 13 08:28:38.550892 kernel: ... version: 4
Feb 13 08:28:38.550897 kernel: ... bit width: 48
Feb 13 08:28:38.550902 kernel: ... generic registers: 4
Feb 13 08:28:38.550907 kernel: ... value mask: 0000ffffffffffff
Feb 13 08:28:38.550912 kernel: ... max period: 00007fffffffffff
Feb 13 08:28:38.550918 kernel: ... fixed-purpose events: 3
Feb 13 08:28:38.550923 kernel: ... event mask: 000000070000000f
Feb 13 08:28:38.550928 kernel: signal: max sigframe size: 2032
Feb 13 08:28:38.550933 kernel: rcu: Hierarchical SRCU implementation.
Feb 13 08:28:38.550938 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Feb 13 08:28:38.550943 kernel: smp: Bringing up secondary CPUs ...
Feb 13 08:28:38.550948 kernel: x86: Booting SMP configuration:
Feb 13 08:28:38.550953 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8
Feb 13 08:28:38.550958 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Feb 13 08:28:38.550964 kernel: #9 #10 #11 #12 #13 #14 #15
Feb 13 08:28:38.550969 kernel: smp: Brought up 1 node, 16 CPUs
Feb 13 08:28:38.550974 kernel: smpboot: Max logical packages: 1
Feb 13 08:28:38.550979 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Feb 13 08:28:38.550983 kernel: devtmpfs: initialized
Feb 13 08:28:38.550989 kernel: x86/mm: Memory block size: 128MB
Feb 13 08:28:38.550994 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x820dd000-0x820ddfff] (4096 bytes)
Feb 13 08:28:38.550999 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Feb 13 08:28:38.551004 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 13 08:28:38.551009 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Feb 13 08:28:38.551014 kernel: pinctrl core: initialized pinctrl subsystem
Feb 13 08:28:38.551019 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 13 08:28:38.551025 kernel: audit: initializing netlink subsys (disabled)
Feb 13 08:28:38.551030 kernel: audit: type=2000 audit(1707812912.040:1): state=initialized audit_enabled=0 res=1
Feb 13 08:28:38.551034 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 13 08:28:38.551039 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 13 08:28:38.551044 kernel: cpuidle: using governor menu
Feb 13 08:28:38.551050 kernel: ACPI: bus type PCI registered
Feb 13 08:28:38.551055 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 13 08:28:38.551060 kernel: dca service started, version 1.12.1
Feb 13 08:28:38.551065 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Feb 13 08:28:38.551070 kernel: PCI: MMCONFIG at [mem 0xe0000000-0xefffffff] reserved in E820
Feb 13 08:28:38.551075 kernel: PCI: Using configuration type 1 for base access
Feb 13 08:28:38.551080 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance'
Feb 13 08:28:38.551085 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 13 08:28:38.551090 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 13 08:28:38.551096 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 13 08:28:38.551101 kernel: ACPI: Added _OSI(Module Device)
Feb 13 08:28:38.551106 kernel: ACPI: Added _OSI(Processor Device)
Feb 13 08:28:38.551111 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 13 08:28:38.551116 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 13 08:28:38.551121 kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 13 08:28:38.551126 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 13 08:28:38.551131 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 13 08:28:38.551136 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Feb 13 08:28:38.551142 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 08:28:38.551147 kernel: ACPI: SSDT 0xFFFF987780213300 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Feb 13 08:28:38.551152 kernel: ACPI: \_SB_.PR00: _OSC native thermal LVT Acked
Feb 13 08:28:38.551157 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 08:28:38.551162 kernel: ACPI: SSDT 0xFFFF987781AE4C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Feb 13 08:28:38.551167 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 08:28:38.551172 kernel: ACPI: SSDT 0xFFFF987781A5C000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Feb 13 08:28:38.551177 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 08:28:38.551182 kernel: ACPI: SSDT 0xFFFF987781A5D000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Feb 13 08:28:38.551187 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 08:28:38.551192 kernel: ACPI: SSDT 0xFFFF98778014A000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Feb 13 08:28:38.551197 kernel: ACPI: Dynamic OEM Table Load:
Feb 13 08:28:38.551202 kernel: ACPI: SSDT 0xFFFF987781AE6400 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Feb 13 08:28:38.551207 kernel: ACPI: Interpreter enabled
Feb 13 08:28:38.551212 kernel: ACPI: PM: (supports S0 S5)
Feb 13 08:28:38.551217 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 13 08:28:38.551222 kernel: HEST: Enabling Firmware First mode for corrected errors.
Feb 13 08:28:38.551227 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Feb 13 08:28:38.551232 kernel: HEST: Table parsing has been initialized.
Feb 13 08:28:38.551238 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Feb 13 08:28:38.551243 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 13 08:28:38.551248 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Feb 13 08:28:38.551253 kernel: ACPI: PM: Power Resource [USBC]
Feb 13 08:28:38.551258 kernel: ACPI: PM: Power Resource [V0PR]
Feb 13 08:28:38.551263 kernel: ACPI: PM: Power Resource [V1PR]
Feb 13 08:28:38.551268 kernel: ACPI: PM: Power Resource [V2PR]
Feb 13 08:28:38.551273 kernel: ACPI: PM: Power Resource [WRST]
Feb 13 08:28:38.551278 kernel: ACPI: PM: Power Resource [FN00]
Feb 13 08:28:38.551284 kernel: ACPI: PM: Power Resource [FN01]
Feb 13 08:28:38.551289 kernel: ACPI: PM: Power Resource [FN02]
Feb 13 08:28:38.551294 kernel: ACPI: PM: Power Resource [FN03]
Feb 13 08:28:38.551298 kernel: ACPI: PM: Power Resource [FN04]
Feb 13 08:28:38.551303 kernel: ACPI: PM: Power Resource [PIN]
Feb 13 08:28:38.551308 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Feb 13 08:28:38.551372 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Feb 13 08:28:38.551437 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Feb 13 08:28:38.551496 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Feb 13 08:28:38.551503 kernel: PCI host bridge to bus 0000:00
Feb 13 08:28:38.551548 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 13 08:28:38.551585 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 13 08:28:38.551621 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 13 08:28:38.551656 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Feb 13 08:28:38.551690 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Feb 13 08:28:38.551728 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Feb 13 08:28:38.551777 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000
Feb 13 08:28:38.551825 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400
Feb 13 08:28:38.551867 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.551913 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000
Feb 13 08:28:38.551955 kernel: pci 0000:00:08.0: reg 0x10: [mem 0x9551f000-0x9551ffff 64bit]
Feb 13 08:28:38.552001 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000
Feb 13 08:28:38.552043 kernel: pci 0000:00:12.0: reg 0x10: [mem 0x9551e000-0x9551efff 64bit]
Feb 13 08:28:38.552089 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330
Feb 13 08:28:38.552131 kernel: pci 0000:00:14.0: reg 0x10: [mem 0x95500000-0x9550ffff 64bit]
Feb 13 08:28:38.552173 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Feb 13 08:28:38.552218 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000
Feb 13 08:28:38.552261 kernel: pci 0000:00:14.2: reg 0x10: [mem 0x95512000-0x95513fff 64bit]
Feb 13 08:28:38.552300 kernel: pci 0000:00:14.2: reg 0x18: [mem 0x9551d000-0x9551dfff 64bit]
Feb 13 08:28:38.552345 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000
Feb 13 08:28:38.552385 kernel: pci 0000:00:15.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 08:28:38.552448 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000
Feb 13 08:28:38.552489 kernel: pci 0000:00:15.1: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 08:28:38.552536 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000
Feb 13 08:28:38.552578 kernel: pci 0000:00:16.0: reg 0x10: [mem 0x9551a000-0x9551afff 64bit]
Feb 13 08:28:38.552619 kernel: pci 0000:00:16.0: PME# supported from D3hot
Feb 13 08:28:38.552663 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000
Feb 13 08:28:38.552703 kernel: pci 0000:00:16.1: reg 0x10: [mem 0x95519000-0x95519fff 64bit]
Feb 13 08:28:38.552745 kernel: pci 0000:00:16.1: PME# supported from D3hot
Feb 13 08:28:38.552791 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000
Feb 13 08:28:38.552834 kernel: pci 0000:00:16.4: reg 0x10: [mem 0x95518000-0x95518fff 64bit]
Feb 13 08:28:38.552874 kernel: pci 0000:00:16.4: PME# supported from D3hot
Feb 13 08:28:38.552919 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601
Feb 13 08:28:38.552959 kernel: pci 0000:00:17.0: reg 0x10: [mem 0x95510000-0x95511fff]
Feb 13 08:28:38.552999 kernel: pci 0000:00:17.0: reg 0x14: [mem 0x95517000-0x955170ff]
Feb 13 08:28:38.553040 kernel: pci 0000:00:17.0: reg 0x18: [io 0x6050-0x6057]
Feb 13 08:28:38.553080 kernel: pci 0000:00:17.0: reg 0x1c: [io 0x6040-0x6043]
Feb 13 08:28:38.553128 kernel: pci 0000:00:17.0: reg 0x20: [io 0x6020-0x603f]
Feb 13 08:28:38.553170 kernel: pci 0000:00:17.0: reg 0x24: [mem 0x95516000-0x955167ff]
Feb 13 08:28:38.553212 kernel: pci 0000:00:17.0: PME# supported from D3hot
Feb 13 08:28:38.553256 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400
Feb 13 08:28:38.553300 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.553344 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400
Feb 13 08:28:38.553387 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.553437 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400
Feb 13 08:28:38.553479 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.553525 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400
Feb 13 08:28:38.553566 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.553613 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400
Feb 13 08:28:38.553657 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.553702 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000
Feb 13 08:28:38.553745 kernel: pci 0000:00:1e.0: reg 0x10: [mem 0x00000000-0x00000fff 64bit]
Feb 13 08:28:38.553792 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100
Feb 13 08:28:38.553840 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500
Feb 13 08:28:38.553881 kernel: pci 0000:00:1f.4: reg 0x10: [mem 0x95514000-0x955140ff 64bit]
Feb 13 08:28:38.553923 kernel: pci 0000:00:1f.4: reg 0x20: [io 0xefa0-0xefbf]
Feb 13 08:28:38.553967 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000
Feb 13 08:28:38.554008 kernel: pci 0000:00:1f.5: reg 0x10: [mem 0xfe010000-0xfe010fff]
Feb 13 08:28:38.554057 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000
Feb 13 08:28:38.554102 kernel: pci 0000:01:00.0: reg 0x10: [mem 0x92000000-0x93ffffff 64bit pref]
Feb 13 08:28:38.554145 kernel: pci 0000:01:00.0: reg 0x30: [mem 0x95200000-0x952fffff pref]
Feb 13 08:28:38.554187 kernel: pci 0000:01:00.0: PME# supported from D3cold
Feb 13 08:28:38.554230 kernel: pci 0000:01:00.0: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 13 08:28:38.554272 kernel: pci 0000:01:00.0: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 13 08:28:38.554319 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000
Feb 13 08:28:38.554361 kernel: pci 0000:01:00.1: reg 0x10: [mem 0x90000000-0x91ffffff 64bit pref]
Feb 13 08:28:38.554409 kernel: pci 0000:01:00.1: reg 0x30: [mem 0x95100000-0x951fffff pref]
Feb 13 08:28:38.554452 kernel: pci 0000:01:00.1: PME# supported from D3cold
Feb 13 08:28:38.554495 kernel: pci 0000:01:00.1: reg 0x1a4: [mem 0x00000000-0x000fffff 64bit pref]
Feb 13 08:28:38.554537 kernel: pci 0000:01:00.1: VF(n) BAR0 space: [mem 0x00000000-0x007fffff 64bit pref] (contains BAR0 for 8 VFs)
Feb 13 08:28:38.554580 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 08:28:38.554621 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Feb 13 08:28:38.554682 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Feb 13 08:28:38.554723 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Feb 13 08:28:38.554773 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000
Feb 13 08:28:38.554817 kernel: pci 0000:03:00.0: reg 0x10: [mem 0x95400000-0x9547ffff]
Feb 13 08:28:38.554859 kernel: pci 0000:03:00.0: reg 0x18: [io 0x5000-0x501f]
Feb 13 08:28:38.554902 kernel: pci 0000:03:00.0: reg 0x1c: [mem 0x95480000-0x95483fff]
Feb 13 08:28:38.554945 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.554987 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Feb 13 08:28:38.555030 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Feb 13 08:28:38.555072 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Feb 13 08:28:38.555118 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000
Feb 13 08:28:38.555160 kernel: pci 0000:04:00.0: reg 0x10: [mem 0x95300000-0x9537ffff]
Feb 13 08:28:38.555277 kernel: pci 0000:04:00.0: reg 0x18: [io 0x4000-0x401f]
Feb 13 08:28:38.555320 kernel: pci 0000:04:00.0: reg 0x1c: [mem 0x95380000-0x95383fff]
Feb 13 08:28:38.555363 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold
Feb 13 08:28:38.555403 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Feb 13 08:28:38.555489 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Feb 13 08:28:38.555531 kernel: pci 0000:00:1b.5: bridge window [mem
0x95300000-0x953fffff] Feb 13 08:28:38.555573 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Feb 13 08:28:38.555618 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 Feb 13 08:28:38.555661 kernel: pci 0000:06:00.0: enabling Extended Tags Feb 13 08:28:38.555703 kernel: pci 0000:06:00.0: supports D1 D2 Feb 13 08:28:38.555746 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 08:28:38.555787 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Feb 13 08:28:38.555827 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Feb 13 08:28:38.555869 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:28:38.555913 kernel: pci_bus 0000:07: extended config space not accessible Feb 13 08:28:38.555963 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 Feb 13 08:28:38.556007 kernel: pci 0000:07:00.0: reg 0x10: [mem 0x94000000-0x94ffffff] Feb 13 08:28:38.556051 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x95000000-0x9501ffff] Feb 13 08:28:38.556094 kernel: pci 0000:07:00.0: reg 0x18: [io 0x3000-0x307f] Feb 13 08:28:38.556140 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 08:28:38.556185 kernel: pci 0000:07:00.0: supports D1 D2 Feb 13 08:28:38.556230 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 08:28:38.556272 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Feb 13 08:28:38.556314 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Feb 13 08:28:38.556356 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:28:38.556363 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Feb 13 08:28:38.556369 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Feb 13 08:28:38.556376 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Feb 13 08:28:38.556382 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Feb 13 08:28:38.556387 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 
0 Feb 13 08:28:38.556392 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Feb 13 08:28:38.556398 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Feb 13 08:28:38.556403 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Feb 13 08:28:38.556432 kernel: iommu: Default domain type: Translated Feb 13 08:28:38.556438 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 08:28:38.556503 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Feb 13 08:28:38.556549 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 08:28:38.556594 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Feb 13 08:28:38.556601 kernel: vgaarb: loaded Feb 13 08:28:38.556607 kernel: pps_core: LinuxPPS API ver. 1 registered Feb 13 08:28:38.556612 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Feb 13 08:28:38.556618 kernel: PTP clock support registered Feb 13 08:28:38.556623 kernel: PCI: Using ACPI for IRQ routing Feb 13 08:28:38.556629 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 08:28:38.556634 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Feb 13 08:28:38.556641 kernel: e820: reserve RAM buffer [mem 0x820dd000-0x83ffffff] Feb 13 08:28:38.556646 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Feb 13 08:28:38.556651 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Feb 13 08:28:38.556657 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Feb 13 08:28:38.556662 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Feb 13 08:28:38.556667 kernel: clocksource: Switched to clocksource tsc-early Feb 13 08:28:38.556672 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 08:28:38.556678 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 08:28:38.556683 kernel: pnp: PnP ACPI init Feb 13 08:28:38.556730 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Feb 13 08:28:38.556771 
kernel: pnp 00:02: [dma 0 disabled] Feb 13 08:28:38.556810 kernel: pnp 00:03: [dma 0 disabled] Feb 13 08:28:38.556849 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Feb 13 08:28:38.556886 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Feb 13 08:28:38.556926 kernel: system 00:05: [io 0x1854-0x1857] has been reserved Feb 13 08:28:38.556967 kernel: system 00:06: [mem 0xfed10000-0xfed17fff] has been reserved Feb 13 08:28:38.557005 kernel: system 00:06: [mem 0xfed18000-0xfed18fff] has been reserved Feb 13 08:28:38.557042 kernel: system 00:06: [mem 0xfed19000-0xfed19fff] has been reserved Feb 13 08:28:38.557077 kernel: system 00:06: [mem 0xe0000000-0xefffffff] has been reserved Feb 13 08:28:38.557114 kernel: system 00:06: [mem 0xfed20000-0xfed3ffff] has been reserved Feb 13 08:28:38.557149 kernel: system 00:06: [mem 0xfed90000-0xfed93fff] could not be reserved Feb 13 08:28:38.557185 kernel: system 00:06: [mem 0xfed45000-0xfed8ffff] has been reserved Feb 13 08:28:38.557223 kernel: system 00:06: [mem 0xfee00000-0xfeefffff] could not be reserved Feb 13 08:28:38.557265 kernel: system 00:07: [io 0x1800-0x18fe] could not be reserved Feb 13 08:28:38.557302 kernel: system 00:07: [mem 0xfd000000-0xfd69ffff] has been reserved Feb 13 08:28:38.557338 kernel: system 00:07: [mem 0xfd6c0000-0xfd6cffff] has been reserved Feb 13 08:28:38.557375 kernel: system 00:07: [mem 0xfd6f0000-0xfdffffff] has been reserved Feb 13 08:28:38.557434 kernel: system 00:07: [mem 0xfe000000-0xfe01ffff] could not be reserved Feb 13 08:28:38.557490 kernel: system 00:07: [mem 0xfe200000-0xfe7fffff] has been reserved Feb 13 08:28:38.557529 kernel: system 00:07: [mem 0xff000000-0xffffffff] has been reserved Feb 13 08:28:38.557568 kernel: system 00:08: [io 0x2000-0x20fe] has been reserved Feb 13 08:28:38.557576 kernel: pnp: PnP ACPI: found 10 devices Feb 13 08:28:38.557582 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 08:28:38.557587 
kernel: NET: Registered PF_INET protocol family Feb 13 08:28:38.557593 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 08:28:38.557598 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Feb 13 08:28:38.557604 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 08:28:38.557611 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 08:28:38.557616 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear) Feb 13 08:28:38.557621 kernel: TCP: Hash tables configured (established 262144 bind 65536) Feb 13 08:28:38.557627 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 08:28:38.557632 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 08:28:38.557638 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 08:28:38.557643 kernel: NET: Registered PF_XDP protocol family Feb 13 08:28:38.557685 kernel: pci 0000:00:15.0: BAR 0: assigned [mem 0x95515000-0x95515fff 64bit] Feb 13 08:28:38.557727 kernel: pci 0000:00:15.1: BAR 0: assigned [mem 0x9551b000-0x9551bfff 64bit] Feb 13 08:28:38.557768 kernel: pci 0000:00:1e.0: BAR 0: assigned [mem 0x9551c000-0x9551cfff 64bit] Feb 13 08:28:38.557811 kernel: pci 0000:01:00.0: BAR 7: no space for [mem size 0x00800000 64bit pref] Feb 13 08:28:38.557854 kernel: pci 0000:01:00.0: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Feb 13 08:28:38.557896 kernel: pci 0000:01:00.1: BAR 7: no space for [mem size 0x00800000 64bit pref] Feb 13 08:28:38.557938 kernel: pci 0000:01:00.1: BAR 7: failed to assign [mem size 0x00800000 64bit pref] Feb 13 08:28:38.557979 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 08:28:38.558019 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Feb 13 08:28:38.558062 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Feb 13 
08:28:38.558101 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Feb 13 08:28:38.558143 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Feb 13 08:28:38.558182 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Feb 13 08:28:38.558224 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Feb 13 08:28:38.558266 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Feb 13 08:28:38.558306 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Feb 13 08:28:38.558347 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Feb 13 08:28:38.558387 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Feb 13 08:28:38.558455 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Feb 13 08:28:38.558497 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Feb 13 08:28:38.558540 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:28:38.558581 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Feb 13 08:28:38.558625 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Feb 13 08:28:38.558666 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Feb 13 08:28:38.558703 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Feb 13 08:28:38.558740 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Feb 13 08:28:38.558775 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Feb 13 08:28:38.558812 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Feb 13 08:28:38.558847 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Feb 13 08:28:38.558883 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Feb 13 08:28:38.558925 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Feb 13 08:28:38.558967 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Feb 13 08:28:38.559009 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Feb 13 08:28:38.559048 kernel: pci_bus 
0000:03: resource 1 [mem 0x95400000-0x954fffff] Feb 13 08:28:38.559089 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Feb 13 08:28:38.559128 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Feb 13 08:28:38.559171 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Feb 13 08:28:38.559210 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Feb 13 08:28:38.559250 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Feb 13 08:28:38.559290 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Feb 13 08:28:38.559298 kernel: PCI: CLS 64 bytes, default 64 Feb 13 08:28:38.559304 kernel: DMAR: No ATSR found Feb 13 08:28:38.559309 kernel: DMAR: No SATC found Feb 13 08:28:38.559315 kernel: DMAR: dmar0: Using Queued invalidation Feb 13 08:28:38.559356 kernel: pci 0000:00:00.0: Adding to iommu group 0 Feb 13 08:28:38.559401 kernel: pci 0000:00:01.0: Adding to iommu group 1 Feb 13 08:28:38.559445 kernel: pci 0000:00:08.0: Adding to iommu group 2 Feb 13 08:28:38.559488 kernel: pci 0000:00:12.0: Adding to iommu group 3 Feb 13 08:28:38.559528 kernel: pci 0000:00:14.0: Adding to iommu group 4 Feb 13 08:28:38.559571 kernel: pci 0000:00:14.2: Adding to iommu group 4 Feb 13 08:28:38.559612 kernel: pci 0000:00:15.0: Adding to iommu group 5 Feb 13 08:28:38.559673 kernel: pci 0000:00:15.1: Adding to iommu group 5 Feb 13 08:28:38.559713 kernel: pci 0000:00:16.0: Adding to iommu group 6 Feb 13 08:28:38.559756 kernel: pci 0000:00:16.1: Adding to iommu group 6 Feb 13 08:28:38.559797 kernel: pci 0000:00:16.4: Adding to iommu group 6 Feb 13 08:28:38.559837 kernel: pci 0000:00:17.0: Adding to iommu group 7 Feb 13 08:28:38.559878 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Feb 13 08:28:38.559918 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Feb 13 08:28:38.559959 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Feb 13 08:28:38.560000 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Feb 13 08:28:38.560040 kernel: pci 0000:00:1c.3: 
Adding to iommu group 12 Feb 13 08:28:38.560082 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Feb 13 08:28:38.560122 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Feb 13 08:28:38.560163 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Feb 13 08:28:38.560203 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Feb 13 08:28:38.560246 kernel: pci 0000:01:00.0: Adding to iommu group 1 Feb 13 08:28:38.560288 kernel: pci 0000:01:00.1: Adding to iommu group 1 Feb 13 08:28:38.560330 kernel: pci 0000:03:00.0: Adding to iommu group 15 Feb 13 08:28:38.560374 kernel: pci 0000:04:00.0: Adding to iommu group 16 Feb 13 08:28:38.560443 kernel: pci 0000:06:00.0: Adding to iommu group 17 Feb 13 08:28:38.560507 kernel: pci 0000:07:00.0: Adding to iommu group 17 Feb 13 08:28:38.560515 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Feb 13 08:28:38.560521 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Feb 13 08:28:38.560526 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Feb 13 08:28:38.560532 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Feb 13 08:28:38.560537 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Feb 13 08:28:38.560542 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Feb 13 08:28:38.560549 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Feb 13 08:28:38.560592 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Feb 13 08:28:38.560600 kernel: Initialise system trusted keyrings Feb 13 08:28:38.560605 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Feb 13 08:28:38.560611 kernel: Key type asymmetric registered Feb 13 08:28:38.560616 kernel: Asymmetric key parser 'x509' registered Feb 13 08:28:38.560621 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Feb 13 08:28:38.560627 kernel: io scheduler mq-deadline registered Feb 13 08:28:38.560634 kernel: io scheduler kyber 
registered Feb 13 08:28:38.560639 kernel: io scheduler bfq registered Feb 13 08:28:38.560679 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Feb 13 08:28:38.560721 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Feb 13 08:28:38.560762 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Feb 13 08:28:38.560802 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Feb 13 08:28:38.560843 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Feb 13 08:28:38.560884 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Feb 13 08:28:38.560930 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Feb 13 08:28:38.560938 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Feb 13 08:28:38.560944 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Feb 13 08:28:38.560949 kernel: pstore: Registered erst as persistent store backend Feb 13 08:28:38.560955 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 13 08:28:38.560960 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 08:28:38.560966 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 08:28:38.560971 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Feb 13 08:28:38.560978 kernel: hpet_acpi_add: no address or irqs in _CRS Feb 13 08:28:38.561019 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Feb 13 08:28:38.561027 kernel: i8042: PNP: No PS/2 controller found. 
Feb 13 08:28:38.561063 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Feb 13 08:28:38.561101 kernel: rtc_cmos rtc_cmos: registered as rtc0 Feb 13 08:28:38.561138 kernel: rtc_cmos rtc_cmos: setting system clock to 2024-02-13T08:28:37 UTC (1707812917) Feb 13 08:28:38.561175 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Feb 13 08:28:38.561182 kernel: fail to initialize ptp_kvm Feb 13 08:28:38.561189 kernel: intel_pstate: Intel P-state driver initializing Feb 13 08:28:38.561195 kernel: intel_pstate: Disabling energy efficiency optimization Feb 13 08:28:38.561200 kernel: intel_pstate: HWP enabled Feb 13 08:28:38.561206 kernel: vesafb: mode is 1024x768x8, linelength=1024, pages=0 Feb 13 08:28:38.561211 kernel: vesafb: scrolling: redraw Feb 13 08:28:38.561216 kernel: vesafb: Pseudocolor: size=0:8:8:8, shift=0:0:0:0 Feb 13 08:28:38.561222 kernel: vesafb: framebuffer at 0x94000000, mapped to 0x00000000c7fc9ee5, using 768k, total 768k Feb 13 08:28:38.561227 kernel: Console: switching to colour frame buffer device 128x48 Feb 13 08:28:38.561233 kernel: fb0: VESA VGA frame buffer device Feb 13 08:28:38.561239 kernel: NET: Registered PF_INET6 protocol family Feb 13 08:28:38.561244 kernel: Segment Routing with IPv6 Feb 13 08:28:38.561250 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 08:28:38.561255 kernel: NET: Registered PF_PACKET protocol family Feb 13 08:28:38.561261 kernel: Key type dns_resolver registered Feb 13 08:28:38.561266 kernel: microcode: sig=0x906ed, pf=0x2, revision=0xf4 Feb 13 08:28:38.561271 kernel: microcode: Microcode Update Driver: v2.2. 
Feb 13 08:28:38.561277 kernel: IPI shorthand broadcast: enabled Feb 13 08:28:38.561282 kernel: sched_clock: Marking stable (1678846018, 1338972218)->(4437452404, -1419634168) Feb 13 08:28:38.561288 kernel: registered taskstats version 1 Feb 13 08:28:38.561294 kernel: Loading compiled-in X.509 certificates Feb 13 08:28:38.561299 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.148-flatcar: 253e5c5c936b12e2ff2626e7f3214deb753330c8' Feb 13 08:28:38.561304 kernel: Key type .fscrypt registered Feb 13 08:28:38.561310 kernel: Key type fscrypt-provisioning registered Feb 13 08:28:38.561315 kernel: pstore: Using crash dump compression: deflate Feb 13 08:28:38.561320 kernel: ima: Allocated hash algorithm: sha1 Feb 13 08:28:38.561326 kernel: ima: No architecture policies found Feb 13 08:28:38.561331 kernel: Freeing unused kernel image (initmem) memory: 45496K Feb 13 08:28:38.561337 kernel: Write protecting the kernel read-only data: 28672k Feb 13 08:28:38.561343 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Feb 13 08:28:38.561348 kernel: Freeing unused kernel image (rodata/data gap) memory: 636K Feb 13 08:28:38.561353 kernel: Run /init as init process Feb 13 08:28:38.561359 kernel: with arguments: Feb 13 08:28:38.561364 kernel: /init Feb 13 08:28:38.561369 kernel: with environment: Feb 13 08:28:38.561375 kernel: HOME=/ Feb 13 08:28:38.561381 kernel: TERM=linux Feb 13 08:28:38.561386 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 08:28:38.561393 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 13 08:28:38.561399 systemd[1]: Detected architecture x86-64. Feb 13 08:28:38.561405 systemd[1]: Running in initrd. 
Feb 13 08:28:38.561435 systemd[1]: No hostname configured, using default hostname. Feb 13 08:28:38.561440 systemd[1]: Hostname set to . Feb 13 08:28:38.561446 systemd[1]: Initializing machine ID from random generator. Feb 13 08:28:38.561470 systemd[1]: Queued start job for default target initrd.target. Feb 13 08:28:38.561475 systemd[1]: Started systemd-ask-password-console.path. Feb 13 08:28:38.561481 systemd[1]: Reached target cryptsetup.target. Feb 13 08:28:38.561486 systemd[1]: Reached target paths.target. Feb 13 08:28:38.561492 systemd[1]: Reached target slices.target. Feb 13 08:28:38.561497 systemd[1]: Reached target swap.target. Feb 13 08:28:38.561503 systemd[1]: Reached target timers.target. Feb 13 08:28:38.561508 systemd[1]: Listening on iscsid.socket. Feb 13 08:28:38.561515 systemd[1]: Listening on iscsiuio.socket. Feb 13 08:28:38.561521 systemd[1]: Listening on systemd-journald-audit.socket. Feb 13 08:28:38.561526 systemd[1]: Listening on systemd-journald-dev-log.socket. Feb 13 08:28:38.561532 systemd[1]: Listening on systemd-journald.socket. Feb 13 08:28:38.561537 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Feb 13 08:28:38.561543 systemd[1]: Listening on systemd-networkd.socket. Feb 13 08:28:38.561548 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Feb 13 08:28:38.561554 kernel: clocksource: Switched to clocksource tsc Feb 13 08:28:38.561560 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 08:28:38.561566 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 08:28:38.561571 systemd[1]: Reached target sockets.target. Feb 13 08:28:38.561577 systemd[1]: Starting kmod-static-nodes.service... Feb 13 08:28:38.561582 systemd[1]: Finished network-cleanup.service. Feb 13 08:28:38.561588 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 08:28:38.561593 systemd[1]: Starting systemd-journald.service... 
Feb 13 08:28:38.561599 systemd[1]: Starting systemd-modules-load.service... Feb 13 08:28:38.561607 systemd-journald[267]: Journal started Feb 13 08:28:38.561632 systemd-journald[267]: Runtime Journal (/run/log/journal/c4b3e0a30ed8430a9e6721c272751887) is 8.0M, max 640.1M, 632.1M free. Feb 13 08:28:38.564140 systemd-modules-load[268]: Inserted module 'overlay' Feb 13 08:28:38.593702 kernel: audit: type=1334 audit(1707812918.569:2): prog-id=6 op=LOAD Feb 13 08:28:38.593712 systemd[1]: Starting systemd-resolved.service... Feb 13 08:28:38.569000 audit: BPF prog-id=6 op=LOAD Feb 13 08:28:38.637447 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 08:28:38.637464 systemd[1]: Starting systemd-vconsole-setup.service... Feb 13 08:28:38.669444 kernel: Bridge firewalling registered Feb 13 08:28:38.669462 systemd[1]: Started systemd-journald.service. Feb 13 08:28:38.683560 systemd-modules-load[268]: Inserted module 'br_netfilter' Feb 13 08:28:38.731650 kernel: audit: type=1130 audit(1707812918.690:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.689474 systemd-resolved[270]: Positive Trust Anchors: Feb 13 08:28:38.807480 kernel: SCSI subsystem initialized Feb 13 08:28:38.807493 kernel: audit: type=1130 audit(1707812918.742:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.807502 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. 
Duplicate IMA measurements will not be recorded in the IMA log. Feb 13 08:28:38.742000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.689480 systemd-resolved[270]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 08:28:38.909539 kernel: device-mapper: uevent: version 1.0.3 Feb 13 08:28:38.909550 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Feb 13 08:28:38.909557 kernel: audit: type=1130 audit(1707812918.864:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.689500 systemd-resolved[270]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 08:28:38.982655 kernel: audit: type=1130 audit(1707812918.917:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.917000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:38.691017 systemd-resolved[270]: Defaulting to hostname 'linux'. Feb 13 08:28:38.991000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.691783 systemd[1]: Started systemd-resolved.service. Feb 13 08:28:39.097632 kernel: audit: type=1130 audit(1707812918.991:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.097644 kernel: audit: type=1130 audit(1707812919.044:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.044000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:38.743586 systemd[1]: Finished kmod-static-nodes.service. Feb 13 08:28:38.865720 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 08:28:38.911489 systemd-modules-load[268]: Inserted module 'dm_multipath' Feb 13 08:28:38.918057 systemd[1]: Finished systemd-modules-load.service. Feb 13 08:28:38.991858 systemd[1]: Finished systemd-vconsole-setup.service. Feb 13 08:28:39.044784 systemd[1]: Reached target nss-lookup.target. Feb 13 08:28:39.107008 systemd[1]: Starting dracut-cmdline-ask.service... Feb 13 08:28:39.136142 systemd[1]: Starting systemd-sysctl.service... Feb 13 08:28:39.136564 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 13 08:28:39.139727 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. 
Feb 13 08:28:39.139000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.140362 systemd[1]: Finished systemd-sysctl.service. Feb 13 08:28:39.190601 kernel: audit: type=1130 audit(1707812919.139:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.202753 systemd[1]: Finished dracut-cmdline-ask.service. Feb 13 08:28:39.268486 kernel: audit: type=1130 audit(1707812919.202:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.259000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.260108 systemd[1]: Starting dracut-cmdline.service... 
Feb 13 08:28:39.284509 dracut-cmdline[292]: dracut-dracut-053 Feb 13 08:28:39.284509 dracut-cmdline[292]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LA Feb 13 08:28:39.284509 dracut-cmdline[292]: BEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=f2beb0668e3dab90bbcf0ace3803b7ee02142bfb86913ef12ef6d2ee81a411a4 Feb 13 08:28:39.351498 kernel: Loading iSCSI transport class v2.0-870. Feb 13 08:28:39.351513 kernel: iscsi: registered transport (tcp) Feb 13 08:28:39.398342 kernel: iscsi: registered transport (qla4xxx) Feb 13 08:28:39.398362 kernel: QLogic iSCSI HBA Driver Feb 13 08:28:39.414642 systemd[1]: Finished dracut-cmdline.service. Feb 13 08:28:39.423000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:39.424100 systemd[1]: Starting dracut-pre-udev.service... 
Feb 13 08:28:39.479411 kernel: raid6: avx2x4 gen() 45451 MB/s Feb 13 08:28:39.514410 kernel: raid6: avx2x4 xor() 22474 MB/s Feb 13 08:28:39.549410 kernel: raid6: avx2x2 gen() 54893 MB/s Feb 13 08:28:39.584410 kernel: raid6: avx2x2 xor() 32796 MB/s Feb 13 08:28:39.619409 kernel: raid6: avx2x1 gen() 46246 MB/s Feb 13 08:28:39.653409 kernel: raid6: avx2x1 xor() 28523 MB/s Feb 13 08:28:39.687410 kernel: raid6: sse2x4 gen() 21834 MB/s Feb 13 08:28:39.721410 kernel: raid6: sse2x4 xor() 11984 MB/s Feb 13 08:28:39.755409 kernel: raid6: sse2x2 gen() 22133 MB/s Feb 13 08:28:39.789410 kernel: raid6: sse2x2 xor() 13712 MB/s Feb 13 08:28:39.823409 kernel: raid6: sse2x1 gen() 18691 MB/s Feb 13 08:28:39.874986 kernel: raid6: sse2x1 xor() 9123 MB/s Feb 13 08:28:39.875012 kernel: raid6: using algorithm avx2x2 gen() 54893 MB/s Feb 13 08:28:39.875025 kernel: raid6: .... xor() 32796 MB/s, rmw enabled Feb 13 08:28:39.893035 kernel: raid6: using avx2x2 recovery algorithm Feb 13 08:28:39.939412 kernel: xor: automatically using best checksumming function avx Feb 13 08:28:40.017417 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Feb 13 08:28:40.022089 systemd[1]: Finished dracut-pre-udev.service. Feb 13 08:28:40.030000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:40.030000 audit: BPF prog-id=7 op=LOAD Feb 13 08:28:40.030000 audit: BPF prog-id=8 op=LOAD Feb 13 08:28:40.031431 systemd[1]: Starting systemd-udevd.service... Feb 13 08:28:40.038944 systemd-udevd[474]: Using default interface naming scheme 'v252'. Feb 13 08:28:40.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:40.046822 systemd[1]: Started systemd-udevd.service. 
Feb 13 08:28:40.087526 dracut-pre-trigger[488]: rd.md=0: removing MD RAID activation Feb 13 08:28:40.064225 systemd[1]: Starting dracut-pre-trigger.service... Feb 13 08:28:40.104000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:40.091503 systemd[1]: Finished dracut-pre-trigger.service. Feb 13 08:28:40.105257 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 08:28:40.176013 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 08:28:40.174000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:40.203416 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 08:28:40.206415 kernel: libata version 3.00 loaded. Feb 13 08:28:40.241891 kernel: ACPI: bus type USB registered Feb 13 08:28:40.241945 kernel: usbcore: registered new interface driver usbfs Feb 13 08:28:40.241955 kernel: usbcore: registered new interface driver hub Feb 13 08:28:40.276824 kernel: usbcore: registered new device driver usb Feb 13 08:28:40.277413 kernel: AVX2 version of gcm_enc/dec engaged. 
Feb 13 08:28:40.309842 kernel: AES CTR mode by8 optimization enabled Feb 13 08:28:40.331731 kernel: ahci 0000:00:17.0: version 3.0 Feb 13 08:28:40.331820 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 08:28:40.331875 kernel: ahci 0000:00:17.0: AHCI 0001.0301 32 slots 7 ports 6 Gbps 0x7f impl SATA mode Feb 13 08:28:40.331923 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Feb 13 08:28:40.351353 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Feb 13 08:28:40.390444 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Feb 13 08:28:40.410412 kernel: scsi host0: ahci Feb 13 08:28:40.440321 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Feb 13 08:28:40.440393 kernel: scsi host1: ahci Feb 13 08:28:40.440413 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Feb 13 08:28:40.484673 kernel: scsi host2: ahci Feb 13 08:28:40.484773 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Feb 13 08:28:40.484838 kernel: scsi host3: ahci Feb 13 08:28:40.514298 kernel: hub 1-0:1.0: USB hub found Feb 13 08:28:40.527436 kernel: scsi host4: ahci Feb 13 08:28:40.527466 kernel: hub 1-0:1.0: 16 ports detected Feb 13 08:28:40.551955 kernel: scsi host5: ahci Feb 13 08:28:40.574811 kernel: hub 2-0:1.0: USB hub found Feb 13 08:28:40.574890 kernel: scsi host6: ahci Feb 13 08:28:40.585232 kernel: hub 2-0:1.0: 10 ports detected Feb 13 08:28:40.585311 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 Feb 13 08:28:40.611368 kernel: usb: port power management may be unreliable Feb 13 08:28:40.611385 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 Feb 13 08:28:40.654112 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 Feb 13 08:28:40.654128 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 Feb 13 
08:28:40.669006 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 Feb 13 08:28:40.683893 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 Feb 13 08:28:40.698773 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 Feb 13 08:28:40.714412 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Feb 13 08:28:40.740665 kernel: mlx5_core 0000:01:00.0: firmware version: 14.27.1016 Feb 13 08:28:40.740739 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Feb 13 08:28:40.740747 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 08:28:40.815236 kernel: pps pps0: new PPS source ptp0 Feb 13 08:28:40.815311 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Feb 13 08:28:40.815328 kernel: igb 0000:03:00.0: added PHC on eth0 Feb 13 08:28:40.844076 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 08:28:40.844149 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6b:0a:d0 Feb 13 08:28:40.874074 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Feb 13 08:28:40.874143 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Feb 13 08:28:40.945465 kernel: pps pps1: new PPS source ptp1 Feb 13 08:28:40.945575 kernel: igb 0000:04:00.0: added PHC on eth1 Feb 13 08:28:40.945655 kernel: hub 1-14:1.0: USB hub found Feb 13 08:28:40.945746 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Feb 13 08:28:40.987156 kernel: hub 1-14:1.0: 4 ports detected Feb 13 08:28:40.987246 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6b:0a:d1 Feb 13 08:28:41.018249 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Feb 13 08:28:41.018333 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 
4 rx queue(s), 4 tx queue(s) Feb 13 08:28:41.030412 kernel: ata6: SATA link down (SStatus 0 SControl 300) Feb 13 08:28:41.030429 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Feb 13 08:28:41.050458 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 08:28:41.050709 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 08:28:41.102411 kernel: ata3: SATA link down (SStatus 0 SControl 300) Feb 13 08:28:41.117439 kernel: ata7: SATA link down (SStatus 0 SControl 300) Feb 13 08:28:41.131410 kernel: ata5: SATA link down (SStatus 0 SControl 300) Feb 13 08:28:41.145450 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Feb 13 08:28:41.161441 kernel: ata4: SATA link down (SStatus 0 SControl 300) Feb 13 08:28:41.176465 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Feb 13 08:28:41.207199 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Feb 13 08:28:41.236725 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 08:28:41.236766 kernel: ata2.00: Features: NCQ-prio Feb 13 08:28:41.267867 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Feb 13 08:28:41.267907 kernel: ata1.00: Features: NCQ-prio Feb 13 08:28:41.271479 kernel: mlx5_core 0000:01:00.0: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 08:28:41.271612 kernel: ata2.00: configured for UDMA/133 Feb 13 08:28:41.271621 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Feb 13 08:28:41.301411 kernel: mlx5_core 0000:01:00.1: firmware version: 14.27.1016 Feb 13 08:28:41.317448 kernel: ata1.00: configured for UDMA/133 Feb 13 08:28:41.317467 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Feb 13 08:28:41.333448 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Feb 13 08:28:41.401416 kernel: scsi 
1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Feb 13 08:28:41.420416 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Feb 13 08:28:41.436412 kernel: hid: raw HID events driver (C) Jiri Kosina Feb 13 08:28:41.469621 kernel: usbcore: registered new interface driver usbhid Feb 13 08:28:41.469663 kernel: usbhid: USB HID core driver Feb 13 08:28:41.470411 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 08:28:41.485069 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:41.499725 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 08:28:41.499806 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Feb 13 08:28:41.534814 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Feb 13 08:28:41.534895 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Feb 13 08:28:41.534959 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Feb 13 08:28:41.549748 kernel: sd 1:0:0:0: [sda] Write Protect is off Feb 13 08:28:41.565414 kernel: sd 0:0:0:0: [sdb] Write Protect is off Feb 13 08:28:41.587469 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Feb 13 08:28:41.587553 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Feb 13 08:28:41.587563 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Feb 13 08:28:41.610477 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Feb 13 08:28:41.610555 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 08:28:41.610624 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Feb 13 08:28:41.624411 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Feb 13 08:28:41.628444 kernel: ata2.00: 
Enabling discard_zeroes_data Feb 13 08:28:41.628460 kernel: port_module: 9 callbacks suppressed Feb 13 08:28:41.628468 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Feb 13 08:28:41.690170 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Feb 13 08:28:41.690248 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Feb 13 08:28:41.690311 kernel: ata2.00: Enabling discard_zeroes_data Feb 13 08:28:41.690320 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Feb 13 08:28:41.902961 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:41.925412 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Feb 13 08:28:41.925487 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Feb 13 08:28:41.960361 kernel: GPT:9289727 != 937703087 Feb 13 08:28:41.960376 kernel: GPT:Alternate GPT header not at the end of the disk. Feb 13 08:28:41.978357 kernel: GPT:9289727 != 937703087 Feb 13 08:28:41.993746 kernel: GPT: Use GNU Parted to correct GPT errors. Feb 13 08:28:42.027486 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 08:28:42.044249 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:42.044264 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Feb 13 08:28:42.093866 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Feb 13 08:28:42.150413 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sdb6 scanned by (udev-worker) (559) Feb 13 08:28:42.150427 kernel: mlx5_core 0000:01:00.1: Supported tc offload range - chains: 4294967294, prios: 4294967295 Feb 13 08:28:42.150498 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth2 Feb 13 08:28:42.133547 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Feb 13 08:28:42.189521 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth0 Feb 13 08:28:42.175065 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. 
Feb 13 08:28:42.196207 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Feb 13 08:28:42.240490 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Feb 13 08:28:42.250900 systemd[1]: Starting disk-uuid.service... Feb 13 08:28:42.302503 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:42.302518 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 08:28:42.302527 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:42.302586 disk-uuid[690]: Primary Header is updated. Feb 13 08:28:42.302586 disk-uuid[690]: Secondary Entries is updated. Feb 13 08:28:42.302586 disk-uuid[690]: Secondary Header is updated. Feb 13 08:28:42.358498 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 08:28:42.358508 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:42.358515 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 08:28:43.344573 kernel: ata1.00: Enabling discard_zeroes_data Feb 13 08:28:43.363172 disk-uuid[691]: The operation has completed successfully. Feb 13 08:28:43.372531 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Feb 13 08:28:43.397788 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 08:28:43.494533 kernel: audit: type=1130 audit(1707812923.405:19): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.494548 kernel: audit: type=1131 audit(1707812923.405:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.405000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:43.405000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.397831 systemd[1]: Finished disk-uuid.service. Feb 13 08:28:43.525593 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 08:28:43.411124 systemd[1]: Starting verity-setup.service... Feb 13 08:28:43.604223 systemd[1]: Found device dev-mapper-usr.device. Feb 13 08:28:43.615975 systemd[1]: Mounting sysusr-usr.mount... Feb 13 08:28:43.628103 systemd[1]: Finished verity-setup.service. Feb 13 08:28:43.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.691414 kernel: audit: type=1130 audit(1707812923.641:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.747091 systemd[1]: Mounted sysusr-usr.mount. Feb 13 08:28:43.762534 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Feb 13 08:28:43.755769 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Feb 13 08:28:43.845259 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Feb 13 08:28:43.845278 kernel: BTRFS info (device sdb6): using free space tree Feb 13 08:28:43.845286 kernel: BTRFS info (device sdb6): has skinny extents Feb 13 08:28:43.845293 kernel: BTRFS info (device sdb6): enabling ssd optimizations Feb 13 08:28:43.756151 systemd[1]: Starting ignition-setup.service... Feb 13 08:28:43.775708 systemd[1]: Starting parse-ip-for-networkd.service... 
Feb 13 08:28:43.918440 kernel: audit: type=1130 audit(1707812923.872:22): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.872000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.854789 systemd[1]: Finished ignition-setup.service. Feb 13 08:28:43.927000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:43.872764 systemd[1]: Finished parse-ip-for-networkd.service. Feb 13 08:28:44.003637 kernel: audit: type=1130 audit(1707812923.927:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.003651 kernel: audit: type=1334 audit(1707812923.981:24): prog-id=9 op=LOAD Feb 13 08:28:43.981000 audit: BPF prog-id=9 op=LOAD Feb 13 08:28:43.928076 systemd[1]: Starting ignition-fetch-offline.service... Feb 13 08:28:43.983216 systemd[1]: Starting systemd-networkd.service... Feb 13 08:28:44.017123 systemd-networkd[876]: lo: Link UP Feb 13 08:28:44.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.051489 ignition[867]: Ignition 2.14.0 Feb 13 08:28:44.094644 kernel: audit: type=1130 audit(1707812924.032:25): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:44.017125 systemd-networkd[876]: lo: Gained carrier Feb 13 08:28:44.051493 ignition[867]: Stage: fetch-offline Feb 13 08:28:44.108000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.017411 systemd-networkd[876]: Enumeration completed Feb 13 08:28:44.232483 kernel: audit: type=1130 audit(1707812924.108:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.232495 kernel: audit: type=1130 audit(1707812924.164:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.232502 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 08:28:44.164000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.051520 ignition[867]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:28:44.255492 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f1np1: link becomes ready Feb 13 08:28:44.017496 systemd[1]: Started systemd-networkd.service. Feb 13 08:28:44.051535 ignition[867]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:28:44.018229 systemd-networkd[876]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 08:28:44.283000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:28:44.059426 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:28:44.314518 iscsid[907]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 13 08:28:44.314518 iscsid[907]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log Feb 13 08:28:44.314518 iscsid[907]: into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Feb 13 08:28:44.314518 iscsid[907]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 13 08:28:44.314518 iscsid[907]: If using hardware iscsi like qla4xxx this message can be ignored. Feb 13 08:28:44.314518 iscsid[907]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 13 08:28:44.314518 iscsid[907]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 13 08:28:44.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:44.033511 systemd[1]: Reached target network.target. 
Feb 13 08:28:44.474553 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 08:28:44.059492 ignition[867]: parsed url from cmdline: "" Feb 13 08:28:44.079819 unknown[867]: fetched base config from "system" Feb 13 08:28:44.059494 ignition[867]: no config URL provided Feb 13 08:28:44.079823 unknown[867]: fetched user config from "system" Feb 13 08:28:44.059497 ignition[867]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 08:28:44.089080 systemd[1]: Starting iscsiuio.service... Feb 13 08:28:44.059529 ignition[867]: parsing config with SHA512: 61cf81d2d459ff89d1e107fc35edb680269e575b125dbc40ff9dcb4335360fbbc613c6eb360bfb561ee096e5885ec92b373c67c54a1d05f0bd2b154f0c1f8fe5 Feb 13 08:28:44.101667 systemd[1]: Started iscsiuio.service. Feb 13 08:28:44.080168 ignition[867]: fetch-offline: fetch-offline passed Feb 13 08:28:44.108766 systemd[1]: Finished ignition-fetch-offline.service. Feb 13 08:28:44.080171 ignition[867]: POST message to Packet Timeline Feb 13 08:28:44.164658 systemd[1]: ignition-fetch.service was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 08:28:44.080176 ignition[867]: POST Status error: resource requires networking Feb 13 08:28:44.165098 systemd[1]: Starting ignition-kargs.service... Feb 13 08:28:44.080206 ignition[867]: Ignition finished successfully Feb 13 08:28:44.233646 systemd-networkd[876]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 08:28:44.236999 ignition[895]: Ignition 2.14.0 Feb 13 08:28:44.247037 systemd[1]: Starting iscsid.service... Feb 13 08:28:44.237003 ignition[895]: Stage: kargs Feb 13 08:28:44.270655 systemd[1]: Started iscsid.service. Feb 13 08:28:44.237057 ignition[895]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:28:44.285008 systemd[1]: Starting dracut-initqueue.service... 
Feb 13 08:28:44.237067 ignition[895]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:28:44.304603 systemd[1]: Finished dracut-initqueue.service. Feb 13 08:28:44.238389 ignition[895]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:28:44.322520 systemd[1]: Reached target remote-fs-pre.target. Feb 13 08:28:44.240312 ignition[895]: kargs: kargs passed Feb 13 08:28:44.333608 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 08:28:44.240316 ignition[895]: POST message to Packet Timeline Feb 13 08:28:44.376631 systemd[1]: Reached target remote-fs.target. Feb 13 08:28:44.240326 ignition[895]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:28:44.400688 systemd[1]: Starting dracut-pre-mount.service... Feb 13 08:28:44.243629 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47016->[::1]:53: read: connection refused Feb 13 08:28:44.430690 systemd[1]: Finished dracut-pre-mount.service. Feb 13 08:28:44.444068 ignition[895]: GET https://metadata.packet.net/metadata: attempt #2 Feb 13 08:28:44.469072 systemd-networkd[876]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Feb 13 08:28:44.444535 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:49973->[::1]:53: read: connection refused Feb 13 08:28:44.499029 systemd-networkd[876]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Feb 13 08:28:44.529621 systemd-networkd[876]: enp1s0f1np1: Link UP Feb 13 08:28:44.530033 systemd-networkd[876]: enp1s0f1np1: Gained carrier Feb 13 08:28:44.537887 systemd-networkd[876]: enp1s0f0np0: Link UP Feb 13 08:28:44.538238 systemd-networkd[876]: eno2: Link UP Feb 13 08:28:44.538593 systemd-networkd[876]: eno1: Link UP Feb 13 08:28:44.845303 ignition[895]: GET https://metadata.packet.net/metadata: attempt #3 Feb 13 08:28:44.846642 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:40530->[::1]:53: read: connection refused Feb 13 08:28:45.312090 systemd-networkd[876]: enp1s0f0np0: Gained carrier Feb 13 08:28:45.321666 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): enp1s0f0np0: link becomes ready Feb 13 08:28:45.344618 systemd-networkd[876]: enp1s0f0np0: DHCPv4 address 145.40.67.89/31, gateway 145.40.67.88 acquired from 145.40.83.140 Feb 13 08:28:45.610899 systemd-networkd[876]: enp1s0f1np1: Gained IPv6LL Feb 13 08:28:45.646975 ignition[895]: GET https://metadata.packet.net/metadata: attempt #4 Feb 13 08:28:45.648227 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54765->[::1]:53: read: connection refused Feb 13 08:28:46.762996 systemd-networkd[876]: enp1s0f0np0: Gained IPv6LL Feb 13 08:28:47.249757 ignition[895]: GET https://metadata.packet.net/metadata: attempt #5 Feb 13 08:28:47.250968 ignition[895]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47304->[::1]:53: read: connection refused Feb 13 08:28:50.454439 ignition[895]: GET https://metadata.packet.net/metadata: attempt #6 Feb 13 08:28:50.489159 ignition[895]: GET result: OK Feb 13 08:28:50.678324 ignition[895]: Ignition finished successfully Feb 13 08:28:50.682876 systemd[1]: Finished ignition-kargs.service. 
Feb 13 08:28:50.765795 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:28:50.765814 kernel: audit: type=1130 audit(1707812930.694:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:50.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:50.703782 ignition[923]: Ignition 2.14.0 Feb 13 08:28:50.696682 systemd[1]: Starting ignition-disks.service... Feb 13 08:28:50.703785 ignition[923]: Stage: disks Feb 13 08:28:50.703858 ignition[923]: reading system config file "/usr/lib/ignition/base.d/base.ign" Feb 13 08:28:50.703867 ignition[923]: parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4 Feb 13 08:28:50.705598 ignition[923]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Feb 13 08:28:50.707312 ignition[923]: disks: disks passed Feb 13 08:28:50.707315 ignition[923]: POST message to Packet Timeline Feb 13 08:28:50.707326 ignition[923]: GET https://metadata.packet.net/metadata: attempt #1 Feb 13 08:28:50.730722 ignition[923]: GET result: OK Feb 13 08:28:50.938134 ignition[923]: Ignition finished successfully Feb 13 08:28:50.941469 systemd[1]: Finished ignition-disks.service. Feb 13 08:28:50.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:50.954949 systemd[1]: Reached target initrd-root-device.target. 
Feb 13 08:28:51.038630 kernel: audit: type=1130 audit(1707812930.954:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:51.024588 systemd[1]: Reached target local-fs-pre.target. Feb 13 08:28:51.024628 systemd[1]: Reached target local-fs.target. Feb 13 08:28:51.046603 systemd[1]: Reached target sysinit.target. Feb 13 08:28:51.060605 systemd[1]: Reached target basic.target. Feb 13 08:28:51.075484 systemd[1]: Starting systemd-fsck-root.service... Feb 13 08:28:51.097690 systemd-fsck[941]: ROOT: clean, 602/553520 files, 56013/553472 blocks Feb 13 08:28:51.110162 systemd[1]: Finished systemd-fsck-root.service. Feb 13 08:28:51.198359 kernel: audit: type=1130 audit(1707812931.117:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:51.198374 kernel: EXT4-fs (sdb9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Feb 13 08:28:51.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:51.124699 systemd[1]: Mounting sysroot.mount... Feb 13 08:28:51.206789 systemd[1]: Mounted sysroot.mount. Feb 13 08:28:51.220754 systemd[1]: Reached target initrd-root-fs.target. Feb 13 08:28:51.238187 systemd[1]: Mounting sysroot-usr.mount... Feb 13 08:28:51.255638 systemd[1]: Starting flatcar-metadata-hostname.service... Feb 13 08:28:51.270476 systemd[1]: Starting flatcar-static-network.service... Feb 13 08:28:51.284755 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 08:28:51.284848 systemd[1]: Reached target ignition-diskful.target. 
Feb 13 08:28:51.303797 systemd[1]: Mounted sysroot-usr.mount.
Feb 13 08:28:51.326872 systemd[1]: Mounting sysroot-usr-share-oem.mount...
Feb 13 08:28:51.339165 systemd[1]: Starting initrd-setup-root.service...
Feb 13 08:28:51.467482 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sdb6 scanned by mount (952)
Feb 13 08:28:51.467498 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 08:28:51.467512 kernel: BTRFS info (device sdb6): using free space tree
Feb 13 08:28:51.467519 kernel: BTRFS info (device sdb6): has skinny extents
Feb 13 08:28:51.467526 kernel: BTRFS info (device sdb6): enabling ssd optimizations
Feb 13 08:28:51.402101 systemd[1]: Finished initrd-setup-root.service.
Feb 13 08:28:51.528450 kernel: audit: type=1130 audit(1707812931.474:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.528498 coreos-metadata[948]: Feb 13 08:28:51.432 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Feb 13 08:28:51.528498 coreos-metadata[948]: Feb 13 08:28:51.454 INFO Fetch successful
Feb 13 08:28:51.528498 coreos-metadata[948]: Feb 13 08:28:51.476 INFO wrote hostname ci-3510.3.2-a-5e41ede811 to /sysroot/etc/hostname
Feb 13 08:28:51.734658 kernel: audit: type=1130 audit(1707812931.537:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.734671 kernel: audit: type=1130 audit(1707812931.600:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.734681 kernel: audit: type=1131 audit(1707812931.600:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.537000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.600000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.600000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-static-network comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.734737 coreos-metadata[949]: Feb 13 08:28:51.432 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Feb 13 08:28:51.734737 coreos-metadata[949]: Feb 13 08:28:51.454 INFO Fetch successful
Feb 13 08:28:51.753546 initrd-setup-root[958]: cut: /sysroot/etc/passwd: No such file or directory
Feb 13 08:28:51.497009 systemd[1]: Finished flatcar-metadata-hostname.service.
Feb 13 08:28:51.784000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.813683 initrd-setup-root[968]: cut: /sysroot/etc/group: No such file or directory
Feb 13 08:28:51.851651 kernel: audit: type=1130 audit(1707812931.784:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:51.537747 systemd[1]: flatcar-static-network.service: Deactivated successfully.
Feb 13 08:28:51.861668 initrd-setup-root[976]: cut: /sysroot/etc/shadow: No such file or directory
Feb 13 08:28:51.537786 systemd[1]: Finished flatcar-static-network.service.
Feb 13 08:28:51.879648 initrd-setup-root[984]: cut: /sysroot/etc/gshadow: No such file or directory
Feb 13 08:28:51.600692 systemd[1]: Mounted sysroot-usr-share-oem.mount.
Feb 13 08:28:51.897654 ignition[1025]: INFO : Ignition 2.14.0
Feb 13 08:28:51.897654 ignition[1025]: INFO : Stage: mount
Feb 13 08:28:51.897654 ignition[1025]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 13 08:28:51.897654 ignition[1025]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 13 08:28:51.897654 ignition[1025]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 13 08:28:51.897654 ignition[1025]: INFO : mount: mount passed
Feb 13 08:28:51.897654 ignition[1025]: INFO : POST message to Packet Timeline
Feb 13 08:28:51.897654 ignition[1025]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 13 08:28:51.897654 ignition[1025]: INFO : GET result: OK
Feb 13 08:28:51.722065 systemd[1]: Starting ignition-mount.service...
Feb 13 08:28:51.742040 systemd[1]: Starting sysroot-boot.service...
Feb 13 08:28:51.768516 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully.
Feb 13 08:28:51.768562 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully.
Feb 13 08:28:51.769307 systemd[1]: Finished sysroot-boot.service.
Feb 13 08:28:52.055243 ignition[1025]: INFO : Ignition finished successfully
Feb 13 08:28:52.057991 systemd[1]: Finished ignition-mount.service.
Feb 13 08:28:52.071000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:52.073554 systemd[1]: Starting ignition-files.service...
Feb 13 08:28:52.143608 kernel: audit: type=1130 audit(1707812932.071:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:52.138220 systemd[1]: Mounting sysroot-usr-share-oem.mount...
Feb 13 08:28:52.192501 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sdb6 scanned by mount (1038)
Feb 13 08:28:52.192512 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 08:28:52.224333 kernel: BTRFS info (device sdb6): using free space tree
Feb 13 08:28:52.224347 kernel: BTRFS info (device sdb6): has skinny extents
Feb 13 08:28:52.273466 kernel: BTRFS info (device sdb6): enabling ssd optimizations
Feb 13 08:28:52.275029 systemd[1]: Mounted sysroot-usr-share-oem.mount.
Feb 13 08:28:52.291535 ignition[1057]: INFO : Ignition 2.14.0
Feb 13 08:28:52.291535 ignition[1057]: INFO : Stage: files
Feb 13 08:28:52.291535 ignition[1057]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 13 08:28:52.291535 ignition[1057]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 13 08:28:52.291535 ignition[1057]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 13 08:28:52.291535 ignition[1057]: DEBUG : files: compiled without relabeling support, skipping
Feb 13 08:28:52.294524 unknown[1057]: wrote ssh authorized keys file for user: core
Feb 13 08:28:52.368596 ignition[1057]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Feb 13 08:28:52.368596 ignition[1057]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Feb 13 08:28:52.368596 ignition[1057]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Feb 13 08:28:52.368596 ignition[1057]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Feb 13 08:28:52.368596 ignition[1057]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Feb 13 08:28:52.368596 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 13 08:28:52.368596 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Feb 13 08:28:52.471937 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Feb 13 08:28:52.533970 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 13 08:28:52.550613 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz"
Feb 13 08:28:52.550613 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://github.com/containernetworking/plugins/releases/download/v1.1.1/cni-plugins-linux-amd64-v1.1.1.tgz: attempt #1
Feb 13 08:28:53.034815 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK
Feb 13 08:28:53.113194 ignition[1057]: DEBUG : files: createFilesystemsFiles: createFiles: op(4): file matches expected sum of: 4d0ed0abb5951b9cf83cba938ef84bdc5b681f4ac869da8143974f6a53a3ff30c666389fa462b9d14d30af09bf03f6cdf77598c572f8fb3ea00cecdda467a48d
Feb 13 08:28:53.113194 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/cni-plugins-linux-amd64-v1.1.1.tgz"
Feb 13 08:28:53.156643 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz"
Feb 13 08:28:53.156643 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET https://github.com/kubernetes-sigs/cri-tools/releases/download/v1.26.0/crictl-v1.26.0-linux-amd64.tar.gz: attempt #1
Feb 13 08:28:53.560177 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(5): GET result: OK
Feb 13 08:28:53.612281 ignition[1057]: DEBUG : files: createFilesystemsFiles: createFiles: op(5): file matches expected sum of: a3a2c02a90b008686c20babaf272e703924db2a3e2a0d4e2a7c81d994cbc68c47458a4a354ecc243af095b390815c7f203348b9749351ae817bd52a522300449
Feb 13 08:28:53.612281 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/opt/crictl-v1.26.0-linux-amd64.tar.gz"
Feb 13 08:28:53.654622 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/bin/kubeadm"
Feb 13 08:28:53.654622 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubeadm: attempt #1
Feb 13 08:28:53.689454 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK
Feb 13 08:28:53.829851 ignition[1057]: DEBUG : files: createFilesystemsFiles: createFiles: op(6): file matches expected sum of: 1c324cd645a7bf93d19d24c87498d9a17878eb1cc927e2680200ffeab2f85051ddec47d85b79b8e774042dc6726299ad3d7caf52c060701f00deba30dc33f660
Feb 13 08:28:53.829851 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/bin/kubeadm"
Feb 13 08:28:53.871721 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/opt/bin/kubelet"
Feb 13 08:28:53.871721 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubelet: attempt #1
Feb 13 08:28:53.902478 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(7): GET result: OK
Feb 13 08:28:54.250920 ignition[1057]: DEBUG : files: createFilesystemsFiles: createFiles: op(7): file matches expected sum of: 40daf2a9b9e666c14b10e627da931bd79978628b1f23ef6429c1cb4fcba261f86ccff440c0dbb0070ee760fe55772b4fd279c4582dfbb17fa30bc94b7f00126b
Feb 13 08:28:54.250920 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/opt/bin/kubelet"
Feb 13 08:28:54.293631 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/opt/bin/kubectl"
Feb 13 08:28:54.293631 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET https://dl.k8s.io/release/v1.26.5/bin/linux/amd64/kubectl: attempt #1
Feb 13 08:28:54.325681 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(8): GET result: OK
Feb 13 08:28:54.498255 ignition[1057]: DEBUG : files: createFilesystemsFiles: createFiles: op(8): file matches expected sum of: 97840854134909d75a1a2563628cc4ba632067369ce7fc8a8a1e90a387d32dd7bfd73f4f5b5a82ef842088e7470692951eb7fc869c5f297dd740f855672ee628
Feb 13 08:28:54.498255 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/opt/bin/kubectl"
Feb 13 08:28:54.548664 kernel: BTRFS info: devid 1 device path /dev/sdb6 changed to /dev/disk/by-label/OEM scanned by ignition (1079)
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/docker/daemon.json"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/docker/daemon.json"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/home/core/install.sh"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/home/core/install.sh"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/home/core/nginx.yaml"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/home/core/nginx.yaml"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(d): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(d): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(e): [started] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(e): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/systemd/system/packet-phone-home.service"
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): oem config not found in "/usr/share/oem", looking on oem partition
Feb 13 08:28:54.548753 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(10): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1052184803"
Feb 13 08:28:54.548753 ignition[1057]: CRITICAL : files: createFilesystemsFiles: createFiles: op(f): op(10): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1052184803": device or resource busy
Feb 13 08:28:54.878620 kernel: audit: type=1130 audit(1707812934.749:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.739544 systemd[1]: Finished ignition-files.service.
Feb 13 08:28:54.888000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.895765 ignition[1057]: ERROR : files: createFilesystemsFiles: createFiles: op(f): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem1052184803", trying btrfs: device or resource busy
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1052184803"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem1052184803"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [started] unmounting "/mnt/oem1052184803"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [finished] unmounting "/mnt/oem1052184803"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/systemd/system/packet-phone-home.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(13): [started] processing unit "coreos-metadata-sshkeys@.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(13): [finished] processing unit "coreos-metadata-sshkeys@.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(14): [started] processing unit "packet-phone-home.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(14): [finished] processing unit "packet-phone-home.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(15): [started] processing unit "prepare-cni-plugins.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(15): op(16): [started] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(15): op(16): [finished] writing unit "prepare-cni-plugins.service" at "/sysroot/etc/systemd/system/prepare-cni-plugins.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(15): [finished] processing unit "prepare-cni-plugins.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(17): [started] processing unit "prepare-critools.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(17): op(18): [started] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service"
Feb 13 08:28:54.895765 ignition[1057]: INFO : files: op(17): op(18): [finished] writing unit "prepare-critools.service" at "/sysroot/etc/systemd/system/prepare-critools.service"
Feb 13 08:28:54.903000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.903000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.972000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.080000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.208000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.755537 systemd[1]: Starting initrd-setup-root-after-ignition.service...
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(17): [finished] processing unit "prepare-critools.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(19): [started] processing unit "prepare-helm.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(19): op(1a): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(19): op(1a): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(19): [finished] processing unit "prepare-helm.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1b): [started] setting preset to enabled for "prepare-cni-plugins.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1b): [finished] setting preset to enabled for "prepare-cni-plugins.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1c): [started] setting preset to enabled for "prepare-critools.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1c): [finished] setting preset to enabled for "prepare-critools.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1d): [started] setting preset to enabled for "prepare-helm.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1d): [finished] setting preset to enabled for "prepare-helm.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1e): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service "
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1e): [finished] setting preset to enabled for "coreos-metadata-sshkeys@.service "
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1f): [started] setting preset to enabled for "packet-phone-home.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: op(1f): [finished] setting preset to enabled for "packet-phone-home.service"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: createResultFile: createFiles: op(20): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: createResultFile: createFiles: op(20): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 08:28:55.289733 ignition[1057]: INFO : files: files passed
Feb 13 08:28:55.289733 ignition[1057]: INFO : POST message to Packet Timeline
Feb 13 08:28:55.289733 ignition[1057]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 13 08:28:55.289733 ignition[1057]: INFO : GET result: OK
Feb 13 08:28:55.289733 ignition[1057]: INFO : Ignition finished successfully
Feb 13 08:28:55.476000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.542000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.651000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.675000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.691000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.737775 initrd-setup-root-after-ignition[1089]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 08:28:54.817670 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile).
Feb 13 08:28:55.863235 kernel: kauditd_printk_skb: 13 callbacks suppressed
Feb 13 08:28:55.863251 kernel: audit: type=1131 audit(1707812935.765:54): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.765000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.863289 iscsid[907]: iscsid shutting down.
Feb 13 08:28:55.928562 kernel: audit: type=1131 audit(1707812935.870:55): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.817966 systemd[1]: Starting ignition-quench.service...
Feb 13 08:28:55.935000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.858795 systemd[1]: Finished initrd-setup-root-after-ignition.service.
Feb 13 08:28:56.058066 kernel: audit: type=1131 audit(1707812935.935:56): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.058152 kernel: audit: type=1131 audit(1707812936.001:57): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.001000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.889002 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 13 08:28:56.121882 kernel: audit: type=1131 audit(1707812936.065:58): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.065000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.889125 systemd[1]: Finished ignition-quench.service.
Feb 13 08:28:54.903880 systemd[1]: Reached target ignition-complete.target.
Feb 13 08:28:56.208624 kernel: audit: type=1131 audit(1707812936.141:59): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.141000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.934526 systemd[1]: Starting initrd-parse-etc.service...
Feb 13 08:28:56.283664 kernel: audit: type=1131 audit(1707812936.216:60): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.283818 ignition[1106]: INFO : Ignition 2.14.0
Feb 13 08:28:56.283818 ignition[1106]: INFO : Stage: umount
Feb 13 08:28:56.283818 ignition[1106]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign"
Feb 13 08:28:56.283818 ignition[1106]: DEBUG : parsing config with SHA512: 0131bd505bfe1b1215ca4ec9809701a3323bf448114294874f7249d8d300440bd742a7532f60673bfa0746c04de0bd5ca68d0fe9a8ecd59464b13a6401323cb4
Feb 13 08:28:56.283818 ignition[1106]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Feb 13 08:28:56.283818 ignition[1106]: INFO : umount: umount passed
Feb 13 08:28:56.283818 ignition[1106]: INFO : POST message to Packet Timeline
Feb 13 08:28:56.283818 ignition[1106]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Feb 13 08:28:56.283818 ignition[1106]: INFO : GET result: OK
Feb 13 08:28:56.283818 ignition[1106]: INFO : Ignition finished successfully
Feb 13 08:28:56.579695 kernel: audit: type=1131 audit(1707812936.292:61): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.579719 kernel: audit: type=1131 audit(1707812936.359:62): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.579731 kernel: audit: type=1131 audit(1707812936.467:63): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.292000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.467000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.533000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:54.955948 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 13 08:28:54.956006 systemd[1]: Finished initrd-parse-etc.service.
Feb 13 08:28:54.972828 systemd[1]: Reached target initrd-fs.target.
Feb 13 08:28:56.627000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.001687 systemd[1]: Reached target initrd.target.
Feb 13 08:28:56.642000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:56.642000 audit: BPF prog-id=6 op=UNLOAD
Feb 13 08:28:55.023983 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met.
Feb 13 08:28:55.026160 systemd[1]: Starting dracut-pre-pivot.service...
Feb 13 08:28:55.059992 systemd[1]: Finished dracut-pre-pivot.service.
Feb 13 08:28:56.689000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.082797 systemd[1]: Starting initrd-cleanup.service...
Feb 13 08:28:56.704000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.116940 systemd[1]: Stopped target nss-lookup.target.
Feb 13 08:28:56.719000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.135005 systemd[1]: Stopped target remote-cryptsetup.target.
Feb 13 08:28:55.162059 systemd[1]: Stopped target timers.target.
Feb 13 08:28:56.750000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:55.188000 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 13 08:28:55.188365 systemd[1]: Stopped dracut-pre-pivot.service.
Feb 13 08:28:55.209314 systemd[1]: Stopped target initrd.target.
Feb 13 08:28:56.799000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.228014 systemd[1]: Stopped target basic.target. Feb 13 08:28:56.813000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.253014 systemd[1]: Stopped target ignition-complete.target. Feb 13 08:28:56.828000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.280117 systemd[1]: Stopped target ignition-diskful.target. Feb 13 08:28:55.298004 systemd[1]: Stopped target initrd-root-device.target. Feb 13 08:28:56.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.318015 systemd[1]: Stopped target remote-fs.target. Feb 13 08:28:56.875000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.338000 systemd[1]: Stopped target remote-fs-pre.target. Feb 13 08:28:56.890000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.364032 systemd[1]: Stopped target sysinit.target. 
Feb 13 08:28:56.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:56.906000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.390025 systemd[1]: Stopped target local-fs.target. Feb 13 08:28:55.410010 systemd[1]: Stopped target local-fs-pre.target. Feb 13 08:28:55.432992 systemd[1]: Stopped target swap.target. Feb 13 08:28:55.454896 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 08:28:55.455259 systemd[1]: Stopped dracut-pre-mount.service. Feb 13 08:28:55.477352 systemd[1]: Stopped target cryptsetup.target. Feb 13 08:28:55.497906 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 08:28:55.498267 systemd[1]: Stopped dracut-initqueue.service. Feb 13 08:28:55.520165 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 08:28:55.520565 systemd[1]: Stopped ignition-fetch-offline.service. Feb 13 08:28:55.543347 systemd[1]: Stopped target paths.target. Feb 13 08:28:55.563873 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 08:28:57.005000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:55.569581 systemd[1]: Stopped systemd-ask-password-console.path. Feb 13 08:28:55.587127 systemd[1]: Stopped target slices.target. Feb 13 08:28:55.606979 systemd[1]: Stopped target sockets.target. Feb 13 08:28:55.628012 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 08:28:55.628397 systemd[1]: Stopped initrd-setup-root-after-ignition.service. 
Feb 13 08:28:55.653110 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 08:28:55.653481 systemd[1]: Stopped ignition-files.service. Feb 13 08:28:55.677107 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Feb 13 08:28:55.677502 systemd[1]: Stopped flatcar-metadata-hostname.service. Feb 13 08:28:55.695178 systemd[1]: Stopping ignition-mount.service... Feb 13 08:28:55.701674 systemd[1]: Stopping iscsid.service... Feb 13 08:28:55.729128 systemd[1]: Stopping sysroot-boot.service... Feb 13 08:28:55.744565 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 08:28:55.744731 systemd[1]: Stopped systemd-udev-trigger.service. Feb 13 08:28:55.767120 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 08:28:55.767553 systemd[1]: Stopped dracut-pre-trigger.service. Feb 13 08:28:55.873961 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 08:28:55.874543 systemd[1]: iscsid.service: Deactivated successfully. Feb 13 08:28:55.874618 systemd[1]: Stopped iscsid.service. Feb 13 08:28:55.936179 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 08:28:55.936264 systemd[1]: Stopped ignition-mount.service. Feb 13 08:28:56.002080 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 08:28:56.002172 systemd[1]: Stopped sysroot-boot.service. Feb 13 08:28:56.066092 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 08:28:56.066165 systemd[1]: Closed iscsid.socket. Feb 13 08:28:56.128660 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 08:28:56.128683 systemd[1]: Stopped ignition-disks.service. Feb 13 08:28:56.141678 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 08:28:56.141734 systemd[1]: Stopped ignition-kargs.service. Feb 13 08:28:56.216675 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 08:28:56.216736 systemd[1]: Stopped ignition-setup.service. 
Feb 13 08:28:56.292718 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 08:28:56.292795 systemd[1]: Stopped initrd-setup-root.service. Feb 13 08:28:56.359818 systemd[1]: Stopping iscsiuio.service... Feb 13 08:28:56.426799 systemd[1]: iscsiuio.service: Deactivated successfully. Feb 13 08:28:56.426845 systemd[1]: Stopped iscsiuio.service. Feb 13 08:28:57.078411 systemd-journald[267]: Received SIGTERM from PID 1 (n/a). Feb 13 08:28:56.467948 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 08:28:56.468077 systemd[1]: Finished initrd-cleanup.service. Feb 13 08:28:56.534004 systemd[1]: Stopped target network.target. Feb 13 08:28:56.542740 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 08:28:56.542768 systemd[1]: Closed iscsiuio.socket. Feb 13 08:28:56.556818 systemd[1]: Stopping systemd-networkd.service... Feb 13 08:28:56.564614 systemd-networkd[876]: enp1s0f0np0: DHCPv6 lease lost Feb 13 08:28:56.572589 systemd-networkd[876]: enp1s0f1np1: DHCPv6 lease lost Feb 13 08:28:56.594795 systemd[1]: Stopping systemd-resolved.service... Feb 13 08:28:57.077000 audit: BPF prog-id=9 op=UNLOAD Feb 13 08:28:56.611997 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 08:28:56.612135 systemd[1]: Stopped systemd-resolved.service. Feb 13 08:28:56.628583 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 08:28:56.628828 systemd[1]: Stopped systemd-networkd.service. Feb 13 08:28:56.642793 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 08:28:56.642812 systemd[1]: Closed systemd-networkd.socket. Feb 13 08:28:56.659060 systemd[1]: Stopping network-cleanup.service... Feb 13 08:28:56.669694 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 08:28:56.669732 systemd[1]: Stopped parse-ip-for-networkd.service. Feb 13 08:28:56.689699 systemd[1]: systemd-sysctl.service: Deactivated successfully. 
Feb 13 08:28:56.689751 systemd[1]: Stopped systemd-sysctl.service. Feb 13 08:28:56.704987 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 08:28:56.705091 systemd[1]: Stopped systemd-modules-load.service. Feb 13 08:28:56.721116 systemd[1]: Stopping systemd-udevd.service... Feb 13 08:28:56.738542 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 08:28:56.740014 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 08:28:56.740377 systemd[1]: Stopped systemd-udevd.service. Feb 13 08:28:56.753155 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 08:28:56.753272 systemd[1]: Closed systemd-udevd-control.socket. Feb 13 08:28:56.766792 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 08:28:56.766890 systemd[1]: Closed systemd-udevd-kernel.socket. Feb 13 08:28:56.784678 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 08:28:56.784809 systemd[1]: Stopped dracut-pre-udev.service. Feb 13 08:28:56.799886 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 08:28:56.800023 systemd[1]: Stopped dracut-cmdline.service. Feb 13 08:28:56.814706 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 08:28:56.814822 systemd[1]: Stopped dracut-cmdline-ask.service. Feb 13 08:28:56.831358 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Feb 13 08:28:56.845474 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Feb 13 08:28:56.845505 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service. Feb 13 08:28:56.860585 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 08:28:56.860617 systemd[1]: Stopped kmod-static-nodes.service. Feb 13 08:28:56.876534 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 08:28:56.876579 systemd[1]: Stopped systemd-vconsole-setup.service. 
Feb 13 08:28:56.892981 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Feb 13 08:28:56.893724 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 08:28:56.893842 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Feb 13 08:28:56.996312 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 08:28:56.996546 systemd[1]: Stopped network-cleanup.service. Feb 13 08:28:57.005866 systemd[1]: Reached target initrd-switch-root.target. Feb 13 08:28:57.025292 systemd[1]: Starting initrd-switch-root.service... Feb 13 08:28:57.036149 systemd[1]: Switching root. Feb 13 08:28:57.079414 systemd-journald[267]: Journal stopped Feb 13 08:29:00.782839 kernel: SELinux: Class mctp_socket not defined in policy. Feb 13 08:29:00.782865 kernel: SELinux: Class anon_inode not defined in policy. Feb 13 08:29:00.782874 kernel: SELinux: the above unknown classes and permissions will be allowed Feb 13 08:29:00.782880 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 08:29:00.782885 kernel: SELinux: policy capability open_perms=1 Feb 13 08:29:00.782890 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 08:29:00.782896 kernel: SELinux: policy capability always_check_network=0 Feb 13 08:29:00.782901 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 08:29:00.782906 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 08:29:00.782912 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 08:29:00.782918 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 08:29:00.782924 systemd[1]: Successfully loaded SELinux policy in 326.268ms. Feb 13 08:29:00.782930 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 5.716ms. 
Feb 13 08:29:00.782937 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 13 08:29:00.782945 systemd[1]: Detected architecture x86-64. Feb 13 08:29:00.782951 systemd[1]: Detected first boot. Feb 13 08:29:00.782956 systemd[1]: Hostname set to . Feb 13 08:29:00.782963 systemd[1]: Initializing machine ID from random generator. Feb 13 08:29:00.782968 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Feb 13 08:29:00.782974 systemd[1]: Populated /etc with preset unit settings. Feb 13 08:29:00.782980 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:29:00.782988 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:29:00.782995 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 08:29:00.783001 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 08:29:00.783007 systemd[1]: Stopped initrd-switch-root.service. Feb 13 08:29:00.783013 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 08:29:00.783019 systemd[1]: Created slice system-addon\x2dconfig.slice. Feb 13 08:29:00.783026 systemd[1]: Created slice system-addon\x2drun.slice. Feb 13 08:29:00.783033 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice. 
Feb 13 08:29:00.783039 systemd[1]: Created slice system-getty.slice. Feb 13 08:29:00.783045 systemd[1]: Created slice system-modprobe.slice. Feb 13 08:29:00.783051 systemd[1]: Created slice system-serial\x2dgetty.slice. Feb 13 08:29:00.783057 systemd[1]: Created slice system-system\x2dcloudinit.slice. Feb 13 08:29:00.783063 systemd[1]: Created slice system-systemd\x2dfsck.slice. Feb 13 08:29:00.783069 systemd[1]: Created slice user.slice. Feb 13 08:29:00.783075 systemd[1]: Started systemd-ask-password-console.path. Feb 13 08:29:00.783082 systemd[1]: Started systemd-ask-password-wall.path. Feb 13 08:29:00.783088 systemd[1]: Set up automount boot.automount. Feb 13 08:29:00.783094 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Feb 13 08:29:00.783100 systemd[1]: Stopped target initrd-switch-root.target. Feb 13 08:29:00.783107 systemd[1]: Stopped target initrd-fs.target. Feb 13 08:29:00.783114 systemd[1]: Stopped target initrd-root-fs.target. Feb 13 08:29:00.783120 systemd[1]: Reached target integritysetup.target. Feb 13 08:29:00.783126 systemd[1]: Reached target remote-cryptsetup.target. Feb 13 08:29:00.783134 systemd[1]: Reached target remote-fs.target. Feb 13 08:29:00.783140 systemd[1]: Reached target slices.target. Feb 13 08:29:00.783147 systemd[1]: Reached target swap.target. Feb 13 08:29:00.783153 systemd[1]: Reached target torcx.target. Feb 13 08:29:00.783159 systemd[1]: Reached target veritysetup.target. Feb 13 08:29:00.783165 systemd[1]: Listening on systemd-coredump.socket. Feb 13 08:29:00.783172 systemd[1]: Listening on systemd-initctl.socket. Feb 13 08:29:00.783178 systemd[1]: Listening on systemd-networkd.socket. Feb 13 08:29:00.783185 systemd[1]: Listening on systemd-udevd-control.socket. Feb 13 08:29:00.783192 systemd[1]: Listening on systemd-udevd-kernel.socket. Feb 13 08:29:00.783198 systemd[1]: Listening on systemd-userdbd.socket. Feb 13 08:29:00.783205 systemd[1]: Mounting dev-hugepages.mount... 
Feb 13 08:29:00.783211 systemd[1]: Mounting dev-mqueue.mount... Feb 13 08:29:00.783217 systemd[1]: Mounting media.mount... Feb 13 08:29:00.783225 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 08:29:00.783231 systemd[1]: Mounting sys-kernel-debug.mount... Feb 13 08:29:00.783237 systemd[1]: Mounting sys-kernel-tracing.mount... Feb 13 08:29:00.783244 systemd[1]: Mounting tmp.mount... Feb 13 08:29:00.783250 systemd[1]: Starting flatcar-tmpfiles.service... Feb 13 08:29:00.783256 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Feb 13 08:29:00.783263 systemd[1]: Starting kmod-static-nodes.service... Feb 13 08:29:00.783269 systemd[1]: Starting modprobe@configfs.service... Feb 13 08:29:00.783276 systemd[1]: Starting modprobe@dm_mod.service... Feb 13 08:29:00.783283 systemd[1]: Starting modprobe@drm.service... Feb 13 08:29:00.783290 systemd[1]: Starting modprobe@efi_pstore.service... Feb 13 08:29:00.783297 systemd[1]: Starting modprobe@fuse.service... Feb 13 08:29:00.783303 kernel: fuse: init (API version 7.34) Feb 13 08:29:00.783309 systemd[1]: Starting modprobe@loop.service... Feb 13 08:29:00.783315 kernel: loop: module loaded Feb 13 08:29:00.783321 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 08:29:00.783328 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 08:29:00.783335 systemd[1]: Stopped systemd-fsck-root.service. Feb 13 08:29:00.783342 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 08:29:00.783348 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 08:29:00.783355 systemd[1]: Stopped systemd-journald.service. Feb 13 08:29:00.783361 systemd[1]: Starting systemd-journald.service... Feb 13 08:29:00.783367 systemd[1]: Starting systemd-modules-load.service... 
Feb 13 08:29:00.783374 kernel: kauditd_printk_skb: 67 callbacks suppressed Feb 13 08:29:00.783380 kernel: audit: type=1305 audit(1707812940.779:124): op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Feb 13 08:29:00.783388 systemd-journald[1257]: Journal started Feb 13 08:29:00.783415 systemd-journald[1257]: Runtime Journal (/run/log/journal/2204414790d44a3dacbadcf11bd25a97) is 8.0M, max 640.1M, 632.1M free. Feb 13 08:28:57.556000 audit: MAC_POLICY_LOAD auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 08:28:57.824000 audit[1]: AVC avc: denied { integrity } for pid=1 comm="systemd" lockdown_reason="/dev/mem,kmem,port" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 08:28:57.827000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 13 08:28:57.827000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Feb 13 08:28:57.827000 audit: BPF prog-id=10 op=LOAD Feb 13 08:28:57.827000 audit: BPF prog-id=10 op=UNLOAD Feb 13 08:28:57.827000 audit: BPF prog-id=11 op=LOAD Feb 13 08:28:57.827000 audit: BPF prog-id=11 op=UNLOAD Feb 13 08:28:57.894000 audit[1147]: AVC avc: denied { associate } for pid=1147 comm="torcx-generator" name="docker" dev="tmpfs" ino=2 scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 srawcon="system_u:object_r:container_file_t:s0:c1022,c1023" Feb 13 08:28:57.894000 audit[1147]: SYSCALL arch=c000003e syscall=188 success=yes exit=0 a0=c0001a58dc a1=c00002ce58 a2=c00002bb00 a3=32 items=0 ppid=1130 pid=1147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:57.894000 audit: PROCTITLE proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61 Feb 13 08:28:57.921000 audit[1147]: AVC avc: denied { associate } for pid=1147 comm="torcx-generator" name="lib" scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 Feb 13 08:28:57.921000 audit[1147]: SYSCALL arch=c000003e syscall=258 success=yes exit=0 a0=ffffffffffffff9c a1=c0001a59b5 a2=1ed a3=0 items=2 ppid=1130 pid=1147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:28:57.921000 audit: CWD cwd="/" Feb 13 08:28:57.921000 audit: PATH item=0 name=(null) inode=2 dev=00:1b mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:28:57.921000 audit: PATH item=1 name=(null) inode=3 dev=00:1b mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:28:57.921000 audit: PROCTITLE proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61 Feb 13 08:28:59.440000 audit: BPF prog-id=12 op=LOAD Feb 13 08:28:59.440000 audit: BPF prog-id=3 op=UNLOAD Feb 13 08:28:59.440000 audit: BPF prog-id=13 op=LOAD Feb 13 
08:28:59.440000 audit: BPF prog-id=14 op=LOAD Feb 13 08:28:59.440000 audit: BPF prog-id=4 op=UNLOAD Feb 13 08:28:59.440000 audit: BPF prog-id=5 op=UNLOAD Feb 13 08:28:59.441000 audit: BPF prog-id=15 op=LOAD Feb 13 08:28:59.441000 audit: BPF prog-id=12 op=UNLOAD Feb 13 08:28:59.441000 audit: BPF prog-id=16 op=LOAD Feb 13 08:28:59.441000 audit: BPF prog-id=17 op=LOAD Feb 13 08:28:59.441000 audit: BPF prog-id=13 op=UNLOAD Feb 13 08:28:59.441000 audit: BPF prog-id=14 op=UNLOAD Feb 13 08:28:59.441000 audit: BPF prog-id=18 op=LOAD Feb 13 08:28:59.441000 audit: BPF prog-id=15 op=UNLOAD Feb 13 08:28:59.441000 audit: BPF prog-id=19 op=LOAD Feb 13 08:28:59.441000 audit: BPF prog-id=20 op=LOAD Feb 13 08:28:59.441000 audit: BPF prog-id=16 op=UNLOAD Feb 13 08:28:59.441000 audit: BPF prog-id=17 op=UNLOAD Feb 13 08:28:59.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:59.489000 audit: BPF prog-id=18 op=UNLOAD Feb 13 08:28:59.496000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:28:59.496000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=initrd-switch-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:00.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:00.734000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:00.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:00.755000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:00.755000 audit: BPF prog-id=21 op=LOAD Feb 13 08:29:00.755000 audit: BPF prog-id=22 op=LOAD Feb 13 08:29:00.755000 audit: BPF prog-id=23 op=LOAD Feb 13 08:29:00.755000 audit: BPF prog-id=19 op=UNLOAD Feb 13 08:29:00.755000 audit: BPF prog-id=20 op=UNLOAD Feb 13 08:29:00.779000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Feb 13 08:28:57.893462 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:28:59.439850 systemd[1]: Queued start job for default target multi-user.target. Feb 13 08:28:57.893907 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="profile found" name=docker-1.12-no path=/usr/share/torcx/profiles/docker-1.12-no.json Feb 13 08:28:59.443553 systemd[1]: systemd-journald.service: Deactivated successfully. 
Feb 13 08:28:57.893921 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="profile found" name=vendor path=/usr/share/torcx/profiles/vendor.json Feb 13 08:28:57.893941 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=info msg="no vendor profile selected by /etc/flatcar/docker-1.12" Feb 13 08:28:57.893947 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="skipped missing lower profile" missing profile=oem Feb 13 08:28:57.893966 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=warning msg="no next profile: unable to read profile file: open /etc/torcx/next-profile: no such file or directory" Feb 13 08:28:57.893974 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="apply configuration parsed" lower profiles (vendor/oem)="[vendor]" upper profile (user)= Feb 13 08:28:57.894111 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="mounted tmpfs" target=/run/torcx/unpack Feb 13 08:28:57.894138 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="profile found" name=docker-1.12-no path=/usr/share/torcx/profiles/docker-1.12-no.json Feb 13 08:28:57.894147 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="profile found" name=vendor path=/usr/share/torcx/profiles/vendor.json Feb 13 08:28:57.894681 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="new archive/reference added to cache" format=tgz name=docker path="/usr/share/torcx/store/docker:20.10.torcx.tgz" reference=20.10 Feb 13 08:28:57.894707 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=debug msg="new archive/reference added to cache" 
format=tgz name=docker path="/usr/share/torcx/store/docker:com.coreos.cl.torcx.tgz" reference=com.coreos.cl Feb 13 08:28:57.894719 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=info msg="store skipped" err="open /usr/share/oem/torcx/store/3510.3.2: no such file or directory" path=/usr/share/oem/torcx/store/3510.3.2 Feb 13 08:28:57.894729 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=info msg="store skipped" err="open /usr/share/oem/torcx/store: no such file or directory" path=/usr/share/oem/torcx/store Feb 13 08:28:57.894740 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=info msg="store skipped" err="open /var/lib/torcx/store/3510.3.2: no such file or directory" path=/var/lib/torcx/store/3510.3.2 Feb 13 08:28:57.894749 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:57Z" level=info msg="store skipped" err="open /var/lib/torcx/store: no such file or directory" path=/var/lib/torcx/store Feb 13 08:28:59.091308 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:59Z" level=debug msg="image unpacked" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 08:28:59.091461 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:59Z" level=debug msg="binaries propagated" assets="[/bin/containerd /bin/containerd-shim /bin/ctr /bin/docker /bin/docker-containerd /bin/docker-containerd-shim /bin/docker-init /bin/docker-proxy /bin/docker-runc /bin/dockerd /bin/runc /bin/tini]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 08:28:59.091519 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:59Z" level=debug msg="networkd units propagated" assets="[/lib/systemd/network/50-docker.network /lib/systemd/network/90-docker-veth.network]" image=docker path=/run/torcx/unpack/docker 
reference=com.coreos.cl Feb 13 08:28:59.091873 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:59Z" level=debug msg="systemd units propagated" assets="[/lib/systemd/system/containerd.service /lib/systemd/system/docker.service /lib/systemd/system/docker.socket /lib/systemd/system/sockets.target.wants /lib/systemd/system/multi-user.target.wants]" image=docker path=/run/torcx/unpack/docker reference=com.coreos.cl Feb 13 08:28:59.091912 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:59Z" level=debug msg="profile applied" sealed profile=/run/torcx/profile.json upper profile= Feb 13 08:28:59.091950 /usr/lib/systemd/system-generators/torcx-generator[1147]: time="2024-02-13T08:28:59Z" level=debug msg="system state sealed" content="[TORCX_LOWER_PROFILES=\"vendor\" TORCX_UPPER_PROFILE=\"\" TORCX_PROFILE_PATH=\"/run/torcx/profile.json\" TORCX_BINDIR=\"/run/torcx/bin\" TORCX_UNPACKDIR=\"/run/torcx/unpack\"]" path=/run/metadata/torcx Feb 13 08:29:00.779000 audit[1257]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7ffc093db500 a2=4000 a3=7ffc093db59c items=0 ppid=1 pid=1257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:00.779000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Feb 13 08:29:00.835411 kernel: audit: type=1300 audit(1707812940.779:124): arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7ffc093db500 a2=4000 a3=7ffc093db59c items=0 ppid=1 pid=1257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:00.835444 kernel: audit: type=1327 audit(1707812940.779:124): proctitle="/usr/lib/systemd/systemd-journald" Feb 13 08:29:00.937414 
systemd[1]: Starting systemd-network-generator.service... Feb 13 08:29:00.962601 systemd[1]: Starting systemd-remount-fs.service... Feb 13 08:29:00.985451 systemd[1]: Starting systemd-udev-trigger.service... Feb 13 08:29:01.022890 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 08:29:01.022917 systemd[1]: Stopped verity-setup.service. Feb 13 08:29:01.029000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.063449 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 08:29:01.063483 kernel: audit: type=1131 audit(1707812941.029:125): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.107455 systemd[1]: Started systemd-journald.service. Feb 13 08:29:01.132000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.133047 systemd[1]: Mounted dev-hugepages.mount. Feb 13 08:29:01.179413 kernel: audit: type=1130 audit(1707812941.132:126): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.186757 systemd[1]: Mounted dev-mqueue.mount. Feb 13 08:29:01.193676 systemd[1]: Mounted media.mount. Feb 13 08:29:01.200676 systemd[1]: Mounted sys-kernel-debug.mount. Feb 13 08:29:01.209678 systemd[1]: Mounted sys-kernel-tracing.mount. Feb 13 08:29:01.218654 systemd[1]: Mounted tmp.mount. 
Feb 13 08:29:01.225736 systemd[1]: Finished flatcar-tmpfiles.service. Feb 13 08:29:01.234000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.234757 systemd[1]: Finished kmod-static-nodes.service. Feb 13 08:29:01.283598 kernel: audit: type=1130 audit(1707812941.234:127): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.291000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.291735 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 08:29:01.291817 systemd[1]: Finished modprobe@configfs.service. Feb 13 08:29:01.341413 kernel: audit: type=1130 audit(1707812941.291:128): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.349000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.349735 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 08:29:01.349810 systemd[1]: Finished modprobe@dm_mod.service. Feb 13 08:29:01.349000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:01.400448 kernel: audit: type=1130 audit(1707812941.349:129): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.400467 kernel: audit: type=1131 audit(1707812941.349:130): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.460736 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 08:29:01.460814 systemd[1]: Finished modprobe@drm.service. Feb 13 08:29:01.460000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.513456 kernel: audit: type=1130 audit(1707812941.460:131): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.521000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.521000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:01.521733 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 08:29:01.521792 systemd[1]: Finished modprobe@efi_pstore.service. Feb 13 08:29:01.530000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.530000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.530735 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 08:29:01.530793 systemd[1]: Finished modprobe@fuse.service. Feb 13 08:29:01.539000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.539000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.539730 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 08:29:01.539789 systemd[1]: Finished modprobe@loop.service. Feb 13 08:29:01.548000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.548000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.548752 systemd[1]: Finished systemd-modules-load.service. 
Feb 13 08:29:01.557000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.557713 systemd[1]: Finished systemd-network-generator.service. Feb 13 08:29:01.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.566717 systemd[1]: Finished systemd-remount-fs.service. Feb 13 08:29:01.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.574730 systemd[1]: Finished systemd-udev-trigger.service. Feb 13 08:29:01.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.585000 systemd[1]: Reached target network-pre.target. Feb 13 08:29:01.594901 systemd[1]: Mounting sys-fs-fuse-connections.mount... Feb 13 08:29:01.605319 systemd[1]: Mounting sys-kernel-config.mount... Feb 13 08:29:01.612687 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 08:29:01.616054 systemd[1]: Starting systemd-hwdb-update.service... Feb 13 08:29:01.624936 systemd[1]: Starting systemd-journal-flush.service... Feb 13 08:29:01.633699 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 08:29:01.634620 systemd[1]: Starting systemd-random-seed.service... 
Feb 13 08:29:01.634929 systemd-journald[1257]: Time spent on flushing to /var/log/journal/2204414790d44a3dacbadcf11bd25a97 is 16.994ms for 1630 entries. Feb 13 08:29:01.634929 systemd-journald[1257]: System Journal (/var/log/journal/2204414790d44a3dacbadcf11bd25a97) is 8.0M, max 195.6M, 187.6M free. Feb 13 08:29:01.688423 systemd-journald[1257]: Received client request to flush runtime journal. Feb 13 08:29:01.649600 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Feb 13 08:29:01.650963 systemd[1]: Starting systemd-sysctl.service... Feb 13 08:29:01.667266 systemd[1]: Starting systemd-sysusers.service... Feb 13 08:29:01.674245 systemd[1]: Starting systemd-udev-settle.service... Feb 13 08:29:01.681486 systemd[1]: Mounted sys-fs-fuse-connections.mount. Feb 13 08:29:01.690093 systemd[1]: Mounted sys-kernel-config.mount. Feb 13 08:29:01.699849 systemd[1]: Finished systemd-journal-flush.service. Feb 13 08:29:01.706000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.707750 systemd[1]: Finished systemd-random-seed.service. Feb 13 08:29:01.714000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.715662 systemd[1]: Finished systemd-sysctl.service. Feb 13 08:29:01.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.724624 systemd[1]: Finished systemd-sysusers.service. 
Feb 13 08:29:01.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.733587 systemd[1]: Reached target first-boot-complete.target. Feb 13 08:29:01.742108 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Feb 13 08:29:01.751604 udevadm[1272]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Feb 13 08:29:01.760281 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Feb 13 08:29:01.767000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.954698 systemd[1]: Finished systemd-hwdb-update.service. Feb 13 08:29:01.963000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:01.963000 audit: BPF prog-id=24 op=LOAD Feb 13 08:29:01.963000 audit: BPF prog-id=25 op=LOAD Feb 13 08:29:01.963000 audit: BPF prog-id=7 op=UNLOAD Feb 13 08:29:01.963000 audit: BPF prog-id=8 op=UNLOAD Feb 13 08:29:01.964745 systemd[1]: Starting systemd-udevd.service... Feb 13 08:29:01.976593 systemd-udevd[1277]: Using default interface naming scheme 'v252'. Feb 13 08:29:01.995825 systemd[1]: Started systemd-udevd.service. Feb 13 08:29:02.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:02.006001 systemd[1]: Condition check resulted in dev-ttyS1.device being skipped. 
Feb 13 08:29:02.005000 audit: BPF prog-id=26 op=LOAD Feb 13 08:29:02.007353 systemd[1]: Starting systemd-networkd.service... Feb 13 08:29:02.034000 audit: BPF prog-id=27 op=LOAD Feb 13 08:29:02.056211 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Feb 13 08:29:02.056288 kernel: ACPI: button: Sleep Button [SLPB] Feb 13 08:29:02.056314 kernel: BTRFS info: devid 1 device path /dev/disk/by-label/OEM changed to /dev/sdb6 scanned by (udev-worker) (1342) Feb 13 08:29:02.056459 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Feb 13 08:29:02.082000 audit: BPF prog-id=28 op=LOAD Feb 13 08:29:02.106000 audit: BPF prog-id=29 op=LOAD Feb 13 08:29:02.108294 systemd[1]: Starting systemd-userdbd.service... Feb 13 08:29:02.113414 kernel: IPMI message handler: version 39.2 Feb 13 08:29:02.113462 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 08:29:02.113486 kernel: ACPI: button: Power Button [PWRF] Feb 13 08:29:02.140609 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. 
Feb 13 08:29:02.033000 audit[1334]: AVC avc: denied { confidentiality } for pid=1334 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Feb 13 08:29:02.033000 audit[1334]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=55ded1a38d30 a1=4d8bc a2=7f83664c4bc5 a3=5 items=42 ppid=1277 pid=1334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:02.033000 audit: CWD cwd="/" Feb 13 08:29:02.033000 audit: PATH item=0 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=1 name=(null) inode=16739 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=2 name=(null) inode=16739 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=3 name=(null) inode=16740 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=4 name=(null) inode=16739 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=5 name=(null) inode=16741 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=6 name=(null) inode=16739 
dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=7 name=(null) inode=16742 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=8 name=(null) inode=16742 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=9 name=(null) inode=16743 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=10 name=(null) inode=16742 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=11 name=(null) inode=16744 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=12 name=(null) inode=16742 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=13 name=(null) inode=16745 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=14 name=(null) inode=16742 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=15 name=(null) inode=16746 dev=00:0b mode=0100640 ouid=0 ogid=0 
rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=16 name=(null) inode=16742 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=17 name=(null) inode=16747 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=18 name=(null) inode=16739 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=19 name=(null) inode=16748 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=20 name=(null) inode=16748 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=21 name=(null) inode=16749 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=22 name=(null) inode=16748 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=23 name=(null) inode=16750 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=24 name=(null) inode=16748 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=25 name=(null) inode=16751 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=26 name=(null) inode=16748 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=27 name=(null) inode=16752 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=28 name=(null) inode=16748 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=29 name=(null) inode=16753 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=30 name=(null) inode=16739 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=31 name=(null) inode=16754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=32 name=(null) inode=16754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=33 name=(null) inode=16755 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 
nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=34 name=(null) inode=16754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=35 name=(null) inode=16756 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=36 name=(null) inode=16754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=37 name=(null) inode=16757 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=38 name=(null) inode=16754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=39 name=(null) inode=16758 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=40 name=(null) inode=16754 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PATH item=41 name=(null) inode=16759 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Feb 13 08:29:02.033000 audit: PROCTITLE proctitle="(udev-worker)" Feb 13 08:29:02.192161 systemd[1]: Started systemd-userdbd.service. 
Feb 13 08:29:02.209416 kernel: ipmi device interface Feb 13 08:29:02.216000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:02.235441 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Feb 13 08:29:02.236045 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Feb 13 08:29:02.236584 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Feb 13 08:29:02.237044 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Feb 13 08:29:02.239433 kernel: i2c i2c-0: 2/4 memory slots populated (from DMI) Feb 13 08:29:02.249430 kernel: ipmi_si: IPMI System Interface driver Feb 13 08:29:02.249505 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Feb 13 08:29:02.249746 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Feb 13 08:29:02.249794 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Feb 13 08:29:02.249830 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Feb 13 08:29:02.250062 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Feb 13 08:29:02.483416 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Feb 13 08:29:02.528779 kernel: ipmi_si: Adding ACPI-specified kcs state machine Feb 13 08:29:02.528804 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Feb 13 08:29:02.575417 kernel: iTCO_vendor_support: vendor-support=0 Feb 13 08:29:02.591084 systemd-networkd[1312]: bond0: netdev ready Feb 13 08:29:02.593483 systemd-networkd[1312]: lo: Link UP Feb 13 08:29:02.593486 systemd-networkd[1312]: lo: Gained carrier Feb 13 08:29:02.594023 systemd-networkd[1312]: Enumeration completed Feb 13 08:29:02.594092 systemd[1]: Started systemd-networkd.service. 
Feb 13 08:29:02.594345 systemd-networkd[1312]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Feb 13 08:29:02.595026 systemd-networkd[1312]: enp1s0f1np1: Configuring with /etc/systemd/network/10-1c:34:da:42:74:e9.network. Feb 13 08:29:02.646195 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Feb 13 08:29:02.646294 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Feb 13 08:29:02.646385 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0) Feb 13 08:29:02.669411 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Feb 13 08:29:02.704000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:02.742666 kernel: intel_rapl_common: Found RAPL domain package Feb 13 08:29:02.742707 kernel: intel_rapl_common: Found RAPL domain core Feb 13 08:29:02.742743 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Feb 13 08:29:02.742839 kernel: intel_rapl_common: Found RAPL domain dram Feb 13 08:29:02.814417 kernel: ipmi_ssif: IPMI SSIF Interface driver Feb 13 08:29:02.918471 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 08:29:02.948066 systemd-networkd[1312]: enp1s0f0np0: Configuring with /etc/systemd/network/10-1c:34:da:42:74:e8.network. 
Feb 13 08:29:02.948437 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Feb 13 08:29:03.034486 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:29:03.132470 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Feb 13 08:29:03.160424 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Feb 13 08:29:03.160523 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:29:03.183418 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): bond0: link becomes ready Feb 13 08:29:03.205678 systemd[1]: Finished systemd-udev-settle.service. Feb 13 08:29:03.207961 systemd-networkd[1312]: bond0: Link UP Feb 13 08:29:03.208152 systemd-networkd[1312]: enp1s0f1np1: Link UP Feb 13 08:29:03.208270 systemd-networkd[1312]: enp1s0f1np1: Gained carrier Feb 13 08:29:03.209244 systemd-networkd[1312]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-1c:34:da:42:74:e8.network. Feb 13 08:29:03.213000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:03.215181 systemd[1]: Starting lvm2-activation-early.service... Feb 13 08:29:03.232010 lvm[1381]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 08:29:03.264756 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Feb 13 08:29:03.264808 kernel: bond0: active interface up! Feb 13 08:29:03.285852 systemd[1]: Finished lvm2-activation-early.service. Feb 13 08:29:03.302000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:03.303550 systemd[1]: Reached target cryptsetup.target. Feb 13 08:29:03.313466 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:29:03.313514 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Feb 13 08:29:03.343060 systemd[1]: Starting lvm2-activation.service... Feb 13 08:29:03.345220 lvm[1382]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 08:29:03.391465 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.391861 systemd[1]: Finished lvm2-activation.service. Feb 13 08:29:03.407000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:03.408549 systemd[1]: Reached target local-fs-pre.target. Feb 13 08:29:03.416450 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.433538 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 08:29:03.433557 systemd[1]: Reached target local-fs.target. Feb 13 08:29:03.439451 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.455516 systemd[1]: Reached target machines.target. Feb 13 08:29:03.463472 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.480102 systemd[1]: Starting ldconfig.service... Feb 13 08:29:03.485413 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.502513 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. 
Feb 13 08:29:03.502533 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 08:29:03.503037 systemd[1]: Starting systemd-boot-update.service... Feb 13 08:29:03.506411 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.527413 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.548466 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.558707 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Feb 13 08:29:03.568411 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.588015 systemd[1]: Starting systemd-machine-id-commit.service... Feb 13 08:29:03.589081 systemd[1]: systemd-sysext.service was skipped because no trigger condition checks were met. Feb 13 08:29:03.589105 systemd[1]: ensure-sysext.service was skipped because no trigger condition checks were met. Feb 13 08:29:03.589411 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.589617 systemd[1]: Starting systemd-tmpfiles-setup.service... Feb 13 08:29:03.589810 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1384 (bootctl) Feb 13 08:29:03.590366 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Feb 13 08:29:03.602461 systemd-tmpfiles[1388]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Feb 13 08:29:03.604461 systemd-tmpfiles[1388]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 08:29:03.606608 systemd-tmpfiles[1388]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. 
Feb 13 08:29:03.610411 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.632460 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.652412 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.673450 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.690811 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Feb 13 08:29:03.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:03.694455 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.694836 systemd-networkd[1312]: enp1s0f0np0: Link UP Feb 13 08:29:03.695012 systemd-networkd[1312]: bond0: Gained carrier Feb 13 08:29:03.695099 systemd-networkd[1312]: enp1s0f0np0: Gained carrier Feb 13 08:29:03.728891 kernel: bond0: (slave enp1s0f1np1): link status down for interface, disabling it in 200 ms Feb 13 08:29:03.728914 kernel: bond0: (slave enp1s0f1np1): invalid new link 1 on slave Feb 13 08:29:03.747215 systemd-networkd[1312]: enp1s0f1np1: Link DOWN Feb 13 08:29:03.747218 systemd-networkd[1312]: enp1s0f1np1: Lost carrier Feb 13 08:29:03.747412 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 10000 Mbps full duplex Feb 13 08:29:03.899416 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Feb 13 08:29:03.917442 kernel: bond0: (slave enp1s0f1np1): speed changed to 0 on port 1 Feb 13 08:29:03.918328 systemd-networkd[1312]: enp1s0f1np1: Link UP Feb 13 08:29:03.918527 systemd-networkd[1312]: enp1s0f1np1: Gained carrier Feb 13 08:29:03.971430 kernel: bond0: (slave enp1s0f1np1): link status up again after 200 ms Feb 13 
08:29:03.989412 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 10000 Mbps full duplex Feb 13 08:29:04.089751 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 08:29:04.090179 systemd[1]: Finished systemd-machine-id-commit.service. Feb 13 08:29:04.088000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:04.113246 systemd-fsck[1392]: fsck.fat 4.2 (2021-01-31) Feb 13 08:29:04.113246 systemd-fsck[1392]: /dev/sdb1: 789 files, 115339/258078 clusters Feb 13 08:29:04.114415 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Feb 13 08:29:04.122000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:04.124302 systemd[1]: Mounting boot.mount... Feb 13 08:29:04.135501 systemd[1]: Mounted boot.mount. Feb 13 08:29:04.154147 systemd[1]: Finished systemd-boot-update.service. Feb 13 08:29:04.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:04.183579 systemd[1]: Finished systemd-tmpfiles-setup.service. Feb 13 08:29:04.190000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:04.192226 systemd[1]: Starting audit-rules.service... Feb 13 08:29:04.199019 systemd[1]: Starting clean-ca-certificates.service... 
Feb 13 08:29:04.207993 systemd[1]: Starting systemd-journal-catalog-update.service... Feb 13 08:29:04.213000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Feb 13 08:29:04.213000 audit[1413]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffe32b83d90 a2=420 a3=0 items=0 ppid=1396 pid=1413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:04.213000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Feb 13 08:29:04.215042 augenrules[1413]: No rules Feb 13 08:29:04.218399 systemd[1]: Starting systemd-resolved.service... Feb 13 08:29:04.226436 systemd[1]: Starting systemd-timesyncd.service... Feb 13 08:29:04.233985 systemd[1]: Starting systemd-update-utmp.service... Feb 13 08:29:04.240763 systemd[1]: Finished audit-rules.service. Feb 13 08:29:04.247623 systemd[1]: Finished clean-ca-certificates.service. Feb 13 08:29:04.255614 systemd[1]: Finished systemd-journal-catalog-update.service. Feb 13 08:29:04.257691 ldconfig[1383]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 08:29:04.265666 systemd[1]: Finished ldconfig.service. Feb 13 08:29:04.274502 systemd[1]: Starting systemd-update-done.service... Feb 13 08:29:04.281492 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 08:29:04.281869 systemd[1]: Finished systemd-update-done.service. Feb 13 08:29:04.292205 systemd[1]: Finished systemd-update-utmp.service. Feb 13 08:29:04.301122 systemd-resolved[1418]: Positive Trust Anchors: Feb 13 08:29:04.301126 systemd-resolved[1418]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 08:29:04.301145 systemd-resolved[1418]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Feb 13 08:29:04.301503 systemd[1]: Started systemd-timesyncd.service. Feb 13 08:29:04.304700 systemd-resolved[1418]: Using system hostname 'ci-3510.3.2-a-5e41ede811'. Feb 13 08:29:04.310699 systemd[1]: Started systemd-resolved.service. Feb 13 08:29:04.318539 systemd[1]: Reached target network.target. Feb 13 08:29:04.326506 systemd[1]: Reached target nss-lookup.target. Feb 13 08:29:04.334503 systemd[1]: Reached target sysinit.target. Feb 13 08:29:04.343625 systemd[1]: Started motdgen.path. Feb 13 08:29:04.350514 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Feb 13 08:29:04.360591 systemd[1]: Started systemd-tmpfiles-clean.timer. Feb 13 08:29:04.368548 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 08:29:04.368569 systemd[1]: Reached target paths.target. Feb 13 08:29:04.375494 systemd[1]: Reached target time-set.target. Feb 13 08:29:04.383663 systemd[1]: Started logrotate.timer. Feb 13 08:29:04.390670 systemd[1]: Started mdadm.timer. Feb 13 08:29:04.397564 systemd[1]: Reached target timers.target. Feb 13 08:29:04.405077 systemd[1]: Listening on dbus.socket. Feb 13 08:29:04.411992 systemd[1]: Starting docker.socket... Feb 13 08:29:04.419827 systemd[1]: Listening on sshd.socket. 
Feb 13 08:29:04.426653 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 08:29:04.426866 systemd[1]: Listening on docker.socket. Feb 13 08:29:04.433869 systemd[1]: Reached target sockets.target. Feb 13 08:29:04.442448 systemd[1]: Reached target basic.target. Feb 13 08:29:04.449467 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 08:29:04.449481 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Feb 13 08:29:04.449926 systemd[1]: Starting containerd.service... Feb 13 08:29:04.456847 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Feb 13 08:29:04.464890 systemd[1]: Starting coreos-metadata.service... Feb 13 08:29:04.471934 systemd[1]: Starting dbus.service... Feb 13 08:29:04.477928 systemd[1]: Starting enable-oem-cloudinit.service... Feb 13 08:29:04.482816 jq[1433]: false Feb 13 08:29:04.485268 systemd[1]: Starting extend-filesystems.service... Feb 13 08:29:04.487578 coreos-metadata[1426]: Feb 13 08:29:04.487 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 08:29:04.492514 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Feb 13 08:29:04.492717 dbus-daemon[1432]: [system] SELinux support is enabled Feb 13 08:29:04.493236 systemd[1]: Starting motdgen.service... 
Feb 13 08:29:04.493541 extend-filesystems[1434]: Found sda Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb1 Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb2 Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb3 Feb 13 08:29:04.512629 extend-filesystems[1434]: Found usr Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb4 Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb6 Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb7 Feb 13 08:29:04.512629 extend-filesystems[1434]: Found sdb9 Feb 13 08:29:04.512629 extend-filesystems[1434]: Checking size of /dev/sdb9 Feb 13 08:29:04.512629 extend-filesystems[1434]: Resized partition /dev/sdb9 Feb 13 08:29:04.675598 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks Feb 13 08:29:04.675657 coreos-metadata[1429]: Feb 13 08:29:04.495 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Feb 13 08:29:04.500321 systemd[1]: Starting prepare-cni-plugins.service... Feb 13 08:29:04.675811 extend-filesystems[1450]: resize2fs 1.46.5 (30-Dec-2021) Feb 13 08:29:04.530145 systemd[1]: Starting prepare-critools.service... Feb 13 08:29:04.686676 dbus-daemon[1432]: [system] Successfully activated service 'org.freedesktop.systemd1' Feb 13 08:29:04.537074 systemd[1]: Starting prepare-helm.service... Feb 13 08:29:04.556916 systemd[1]: Starting ssh-key-proc-cmdline.service... Feb 13 08:29:04.570899 systemd[1]: Starting sshd-keygen.service... Feb 13 08:29:04.586745 systemd[1]: Starting systemd-logind.service... Feb 13 08:29:04.599453 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 13 08:29:04.599951 systemd[1]: Starting tcsd.service... 
Feb 13 08:29:04.692080 update_engine[1465]: I0213 08:29:04.660346 1465 main.cc:92] Flatcar Update Engine starting Feb 13 08:29:04.692080 update_engine[1465]: I0213 08:29:04.663795 1465 update_check_scheduler.cc:74] Next update check in 4m50s Feb 13 08:29:04.609088 systemd-logind[1463]: Watching system buttons on /dev/input/event3 (Power Button) Feb 13 08:29:04.692337 jq[1466]: true Feb 13 08:29:04.609100 systemd-logind[1463]: Watching system buttons on /dev/input/event2 (Sleep Button) Feb 13 08:29:04.609113 systemd-logind[1463]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Feb 13 08:29:04.692534 tar[1468]: ./ Feb 13 08:29:04.692534 tar[1468]: ./macvlan Feb 13 08:29:04.692534 tar[1468]: ./static Feb 13 08:29:04.609256 systemd-logind[1463]: New seat seat0. Feb 13 08:29:04.692773 tar[1469]: crictl Feb 13 08:29:04.611798 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 08:29:04.692969 tar[1470]: linux-amd64/helm Feb 13 08:29:04.612151 systemd[1]: Starting update-engine.service... Feb 13 08:29:04.693160 jq[1474]: true Feb 13 08:29:04.626951 systemd[1]: Starting update-ssh-keys-after-ignition.service... Feb 13 08:29:04.642836 systemd[1]: Started dbus.service. Feb 13 08:29:04.651136 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 08:29:04.651220 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Feb 13 08:29:04.651364 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 08:29:04.651444 systemd[1]: Finished motdgen.service. Feb 13 08:29:04.662507 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 08:29:04.662592 systemd[1]: Finished ssh-key-proc-cmdline.service. Feb 13 08:29:04.692101 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Feb 13 08:29:04.692202 systemd[1]: Condition check resulted in tcsd.service being skipped. 
Feb 13 08:29:04.694177 systemd[1]: Started systemd-logind.service. Feb 13 08:29:04.703349 tar[1468]: ./vlan Feb 13 08:29:04.704563 systemd[1]: Started update-engine.service. Feb 13 08:29:04.704814 env[1475]: time="2024-02-13T08:29:04.704791801Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Feb 13 08:29:04.713413 env[1475]: time="2024-02-13T08:29:04.713390360Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 08:29:04.713569 env[1475]: time="2024-02-13T08:29:04.713557569Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:29:04.713696 bash[1498]: Updated "/home/core/.ssh/authorized_keys" Feb 13 08:29:04.714157 systemd[1]: Started locksmithd.service. Feb 13 08:29:04.714306 env[1475]: time="2024-02-13T08:29:04.714286804Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.148-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714337 env[1475]: time="2024-02-13T08:29:04.714307318Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714510 env[1475]: time="2024-02-13T08:29:04.714494074Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714549 env[1475]: time="2024-02-13T08:29:04.714510853Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714549 env[1475]: time="2024-02-13T08:29:04.714523228Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Feb 13 08:29:04.714549 env[1475]: time="2024-02-13T08:29:04.714532828Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714619 env[1475]: time="2024-02-13T08:29:04.714592761Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714772 env[1475]: time="2024-02-13T08:29:04.714760210Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714870 env[1475]: time="2024-02-13T08:29:04.714856721Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 08:29:04.714900 env[1475]: time="2024-02-13T08:29:04.714871688Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 08:29:04.714925 env[1475]: time="2024-02-13T08:29:04.714910959Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Feb 13 08:29:04.714953 env[1475]: time="2024-02-13T08:29:04.714923682Z" level=info msg="metadata content store policy set" policy=shared Feb 13 08:29:04.720560 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 08:29:04.720645 systemd[1]: Reached target system-config.target. Feb 13 08:29:04.722768 env[1475]: time="2024-02-13T08:29:04.722750309Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." 
type=io.containerd.differ.v1 Feb 13 08:29:04.722811 env[1475]: time="2024-02-13T08:29:04.722770134Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 08:29:04.722811 env[1475]: time="2024-02-13T08:29:04.722779045Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 08:29:04.722811 env[1475]: time="2024-02-13T08:29:04.722795119Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722811 env[1475]: time="2024-02-13T08:29:04.722803269Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722811 env[1475]: time="2024-02-13T08:29:04.722810958Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722818293Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722826653Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722834484Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722841961Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722848972Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722855780Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Feb 13 08:29:04.722960 env[1475]: time="2024-02-13T08:29:04.722908095Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 08:29:04.723121 env[1475]: time="2024-02-13T08:29:04.722965830Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 08:29:04.723121 env[1475]: time="2024-02-13T08:29:04.723103532Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 08:29:04.723170 env[1475]: time="2024-02-13T08:29:04.723125062Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723170 env[1475]: time="2024-02-13T08:29:04.723133260Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 08:29:04.723170 env[1475]: time="2024-02-13T08:29:04.723162841Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723172775Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723187355Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723197919Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723204890Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723211291Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." 
type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723220493Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723243 env[1475]: time="2024-02-13T08:29:04.723232851Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723405 env[1475]: time="2024-02-13T08:29:04.723244871Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 08:29:04.723405 env[1475]: time="2024-02-13T08:29:04.723309012Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723405 env[1475]: time="2024-02-13T08:29:04.723389603Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723405 env[1475]: time="2024-02-13T08:29:04.723396670Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 08:29:04.723405 env[1475]: time="2024-02-13T08:29:04.723402794Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 08:29:04.723530 env[1475]: time="2024-02-13T08:29:04.723415136Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Feb 13 08:29:04.723530 env[1475]: time="2024-02-13T08:29:04.723422017Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 08:29:04.723530 env[1475]: time="2024-02-13T08:29:04.723431516Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Feb 13 08:29:04.723530 env[1475]: time="2024-02-13T08:29:04.723452083Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 13 08:29:04.723631 env[1475]: time="2024-02-13T08:29:04.723569571Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd 
ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 08:29:04.723631 env[1475]: time="2024-02-13T08:29:04.723600704Z" level=info msg="Connect containerd service" Feb 13 08:29:04.723631 env[1475]: time="2024-02-13T08:29:04.723620440Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.723905368Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724016710Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724017364Z" level=info msg="Start subscribing containerd event" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724038371Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724046399Z" level=info msg="Start recovering state" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724061374Z" level=info msg="containerd successfully booted in 0.025880s" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724079683Z" level=info msg="Start event monitor" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724087822Z" level=info msg="Start snapshots syncer" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724093458Z" level=info msg="Start cni network conf syncer for default" Feb 13 08:29:04.725677 env[1475]: time="2024-02-13T08:29:04.724097874Z" level=info msg="Start streaming server" Feb 13 08:29:04.728555 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 08:29:04.728647 systemd[1]: Reached target user-config.target. Feb 13 08:29:04.734108 tar[1468]: ./portmap Feb 13 08:29:04.738081 systemd[1]: Started containerd.service. Feb 13 08:29:04.744712 systemd[1]: Finished update-ssh-keys-after-ignition.service. Feb 13 08:29:04.753257 tar[1468]: ./host-local Feb 13 08:29:04.770077 tar[1468]: ./vrf Feb 13 08:29:04.777421 locksmithd[1508]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 08:29:04.788293 tar[1468]: ./bridge Feb 13 08:29:04.810168 tar[1468]: ./tuning Feb 13 08:29:04.827732 tar[1468]: ./firewall Feb 13 08:29:04.850167 tar[1468]: ./host-device Feb 13 08:29:04.869677 tar[1468]: ./sbr Feb 13 08:29:04.887621 tar[1468]: ./loopback Feb 13 08:29:04.904603 tar[1468]: ./dhcp Feb 13 08:29:04.930018 sshd_keygen[1462]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 08:29:04.941668 tar[1470]: linux-amd64/LICENSE Feb 13 08:29:04.941747 tar[1470]: linux-amd64/README.md Feb 13 08:29:04.942690 systemd[1]: Finished sshd-keygen.service. 
Feb 13 08:29:04.951226 systemd[1]: Finished prepare-helm.service. Feb 13 08:29:04.954035 tar[1468]: ./ptp Feb 13 08:29:04.960742 systemd[1]: Finished prepare-critools.service. Feb 13 08:29:04.970344 systemd[1]: Starting issuegen.service... Feb 13 08:29:04.975154 tar[1468]: ./ipvlan Feb 13 08:29:04.977675 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 08:29:04.977763 systemd[1]: Finished issuegen.service. Feb 13 08:29:04.986319 systemd[1]: Starting systemd-user-sessions.service... Feb 13 08:29:04.994679 systemd[1]: Finished systemd-user-sessions.service. Feb 13 08:29:04.995588 tar[1468]: ./bandwidth Feb 13 08:29:05.003491 systemd-networkd[1312]: bond0: Gained IPv6LL Feb 13 08:29:05.003661 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. Feb 13 08:29:05.004297 systemd[1]: Started getty@tty1.service. Feb 13 08:29:05.021230 systemd[1]: Started serial-getty@ttyS1.service. Feb 13 08:29:05.049860 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Feb 13 08:29:05.029594 systemd[1]: Reached target getty.target. Feb 13 08:29:05.049962 extend-filesystems[1450]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Feb 13 08:29:05.049962 extend-filesystems[1450]: old_desc_blocks = 1, new_desc_blocks = 56 Feb 13 08:29:05.049962 extend-filesystems[1450]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Feb 13 08:29:05.075544 extend-filesystems[1434]: Resized filesystem in /dev/sdb9 Feb 13 08:29:05.050592 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 08:29:05.050674 systemd[1]: Finished extend-filesystems.service. Feb 13 08:29:05.060862 systemd[1]: Finished prepare-cni-plugins.service. Feb 13 08:29:05.322669 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. Feb 13 08:29:05.322768 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. 
Feb 13 08:29:06.125630 kernel: mlx5_core 0000:01:00.0: lag map port 1:1 port 2:2 shared_fdb:0 Feb 13 08:29:10.180003 login[1532]: pam_lastlog(login:session): file /var/log/lastlog is locked/write Feb 13 08:29:10.186490 login[1533]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 08:29:10.216160 systemd-logind[1463]: New session 2 of user core. Feb 13 08:29:10.218839 systemd[1]: Created slice user-500.slice. Feb 13 08:29:10.221695 systemd[1]: Starting user-runtime-dir@500.service... Feb 13 08:29:10.232304 systemd[1]: Finished user-runtime-dir@500.service. Feb 13 08:29:10.233011 systemd[1]: Starting user@500.service... Feb 13 08:29:10.235026 (systemd)[1540]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:10.309677 systemd[1540]: Queued start job for default target default.target. Feb 13 08:29:10.309909 systemd[1540]: Reached target paths.target. Feb 13 08:29:10.309921 systemd[1540]: Reached target sockets.target. Feb 13 08:29:10.309929 systemd[1540]: Reached target timers.target. Feb 13 08:29:10.309937 systemd[1540]: Reached target basic.target. Feb 13 08:29:10.309956 systemd[1540]: Reached target default.target. Feb 13 08:29:10.309970 systemd[1540]: Startup finished in 71ms. Feb 13 08:29:10.310013 systemd[1]: Started user@500.service. Feb 13 08:29:10.310597 systemd[1]: Started session-2.scope. 
Feb 13 08:29:10.635477 coreos-metadata[1426]: Feb 13 08:29:10.635 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 08:29:10.636216 coreos-metadata[1429]: Feb 13 08:29:10.635 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata): error trying to connect: dns error: failed to lookup address information: Name or service not known Feb 13 08:29:11.180951 login[1532]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 08:29:11.191918 systemd-logind[1463]: New session 1 of user core. Feb 13 08:29:11.193817 systemd[1]: Started session-1.scope. Feb 13 08:29:11.384837 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:2 port 2:2 Feb 13 08:29:11.385000 kernel: mlx5_core 0000:01:00.0: modify lag map port 1:1 port 2:2 Feb 13 08:29:11.635708 coreos-metadata[1426]: Feb 13 08:29:11.635 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 08:29:11.636507 coreos-metadata[1429]: Feb 13 08:29:11.635 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Feb 13 08:29:11.663804 coreos-metadata[1429]: Feb 13 08:29:11.663 INFO Fetch successful Feb 13 08:29:11.664100 coreos-metadata[1426]: Feb 13 08:29:11.664 INFO Fetch successful Feb 13 08:29:11.689549 systemd[1]: Finished coreos-metadata.service. Feb 13 08:29:11.690245 unknown[1426]: wrote ssh authorized keys file for user: core Feb 13 08:29:11.690357 systemd[1]: Started packet-phone-home.service. Feb 13 08:29:11.699928 curl[1562]: % Total % Received % Xferd Average Speed Time Time Time Current Feb 13 08:29:11.700083 curl[1562]: Dload Upload Total Spent Left Speed Feb 13 08:29:11.703045 update-ssh-keys[1563]: Updated "/home/core/.ssh/authorized_keys" Feb 13 08:29:11.703235 systemd[1]: Finished coreos-metadata-sshkeys@core.service. 
Feb 13 08:29:11.703594 systemd[1]: Reached target multi-user.target. Feb 13 08:29:11.704239 systemd[1]: Starting systemd-update-utmp-runlevel.service... Feb 13 08:29:11.708060 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Feb 13 08:29:11.708129 systemd[1]: Finished systemd-update-utmp-runlevel.service. Feb 13 08:29:11.708258 systemd[1]: Startup finished in 1.849s (kernel) + 19.376s (initrd) + 14.499s (userspace) = 35.725s. Feb 13 08:29:11.899026 curl[1562]: \u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\u000d 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 Feb 13 08:29:11.901567 systemd[1]: packet-phone-home.service: Deactivated successfully. Feb 13 08:29:11.978043 systemd[1]: Created slice system-sshd.slice. Feb 13 08:29:11.978695 systemd[1]: Started sshd@0-145.40.67.89:22-139.178.68.195:42222.service. Feb 13 08:29:12.024917 sshd[1567]: Accepted publickey for core from 139.178.68.195 port 42222 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:12.026421 sshd[1567]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:12.031359 systemd-logind[1463]: New session 3 of user core. Feb 13 08:29:12.033278 systemd[1]: Started session-3.scope. Feb 13 08:29:12.098987 systemd[1]: Started sshd@1-145.40.67.89:22-139.178.68.195:42226.service. Feb 13 08:29:12.136201 sshd[1572]: Accepted publickey for core from 139.178.68.195 port 42226 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:29:12.136871 sshd[1572]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:29:12.139040 systemd-logind[1463]: New session 4 of user core. Feb 13 08:29:12.139783 systemd[1]: Started session-4.scope. Feb 13 08:29:12.190863 sshd[1572]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:12.193951 systemd[1]: sshd@1-145.40.67.89:22-139.178.68.195:42226.service: Deactivated successfully. 
Feb 13 08:29:12.194668 systemd[1]: session-4.scope: Deactivated successfully.
Feb 13 08:29:12.195313 systemd-logind[1463]: Session 4 logged out. Waiting for processes to exit.
Feb 13 08:29:12.196513 systemd[1]: Started sshd@2-145.40.67.89:22-139.178.68.195:42236.service.
Feb 13 08:29:12.197575 systemd-logind[1463]: Removed session 4.
Feb 13 08:29:12.241195 sshd[1578]: Accepted publickey for core from 139.178.68.195 port 42236 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:29:12.242163 sshd[1578]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:29:12.245142 systemd-logind[1463]: New session 5 of user core.
Feb 13 08:29:12.246187 systemd[1]: Started session-5.scope.
Feb 13 08:29:12.298986 sshd[1578]: pam_unix(sshd:session): session closed for user core
Feb 13 08:29:12.300900 systemd[1]: sshd@2-145.40.67.89:22-139.178.68.195:42236.service: Deactivated successfully.
Feb 13 08:29:12.301270 systemd[1]: session-5.scope: Deactivated successfully.
Feb 13 08:29:12.301627 systemd-logind[1463]: Session 5 logged out. Waiting for processes to exit.
Feb 13 08:29:12.302202 systemd[1]: Started sshd@3-145.40.67.89:22-139.178.68.195:42248.service.
Feb 13 08:29:12.302690 systemd-logind[1463]: Removed session 5.
Feb 13 08:29:12.336999 sshd[1584]: Accepted publickey for core from 139.178.68.195 port 42248 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:29:12.338170 sshd[1584]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:29:12.342087 systemd-logind[1463]: New session 6 of user core.
Feb 13 08:29:12.343611 systemd[1]: Started session-6.scope.
Feb 13 08:29:12.411819 sshd[1584]: pam_unix(sshd:session): session closed for user core
Feb 13 08:29:12.420188 systemd[1]: sshd@3-145.40.67.89:22-139.178.68.195:42248.service: Deactivated successfully.
Feb 13 08:29:12.422302 systemd[1]: session-6.scope: Deactivated successfully.
Feb 13 08:29:12.424263 systemd-logind[1463]: Session 6 logged out. Waiting for processes to exit.
Feb 13 08:29:12.427325 systemd[1]: Started sshd@4-145.40.67.89:22-139.178.68.195:42250.service.
Feb 13 08:29:12.430055 systemd-logind[1463]: Removed session 6.
Feb 13 08:29:12.499975 sshd[1590]: Accepted publickey for core from 139.178.68.195 port 42250 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:29:12.503038 sshd[1590]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:29:12.513253 systemd-logind[1463]: New session 7 of user core.
Feb 13 08:29:12.517052 systemd[1]: Started session-7.scope.
Feb 13 08:29:12.612768 sudo[1593]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Feb 13 08:29:12.613352 sudo[1593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Feb 13 08:29:12.640898 dbus-daemon[1432]: \xd0m%\x9e\xf7U: received setenforce notice (enforcing=544778976)
Feb 13 08:29:12.645950 sudo[1593]: pam_unix(sudo:session): session closed for user root
Feb 13 08:29:12.651238 sshd[1590]: pam_unix(sshd:session): session closed for user core
Feb 13 08:29:12.660015 systemd[1]: sshd@4-145.40.67.89:22-139.178.68.195:42250.service: Deactivated successfully.
Feb 13 08:29:12.662192 systemd[1]: session-7.scope: Deactivated successfully.
Feb 13 08:29:12.664257 systemd-logind[1463]: Session 7 logged out. Waiting for processes to exit.
Feb 13 08:29:12.667400 systemd[1]: Started sshd@5-145.40.67.89:22-139.178.68.195:42260.service.
Feb 13 08:29:12.670164 systemd-logind[1463]: Removed session 7.
Feb 13 08:29:12.734100 sshd[1597]: Accepted publickey for core from 139.178.68.195 port 42260 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:29:12.736364 sshd[1597]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:29:12.743753 systemd-logind[1463]: New session 8 of user core.
Feb 13 08:29:12.746371 systemd[1]: Started session-8.scope.
Feb 13 08:29:12.813204 sudo[1601]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Feb 13 08:29:12.813314 sudo[1601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Feb 13 08:29:12.815136 sudo[1601]: pam_unix(sudo:session): session closed for user root
Feb 13 08:29:12.817298 sudo[1600]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Feb 13 08:29:12.817397 sudo[1600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Feb 13 08:29:12.822506 systemd[1]: Stopping audit-rules.service...
Feb 13 08:29:12.821000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1
Feb 13 08:29:12.823259 auditctl[1604]: No rules
Feb 13 08:29:12.823431 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 08:29:12.823518 systemd[1]: Stopped audit-rules.service.
Feb 13 08:29:12.824382 systemd[1]: Starting audit-rules.service...
Feb 13 08:29:12.828617 kernel: kauditd_printk_skb: 87 callbacks suppressed
Feb 13 08:29:12.828653 kernel: audit: type=1305 audit(1707812952.821:172): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1
Feb 13 08:29:12.834221 augenrules[1621]: No rules
Feb 13 08:29:12.834505 systemd[1]: Finished audit-rules.service.
Feb 13 08:29:12.835028 sudo[1600]: pam_unix(sudo:session): session closed for user root
Feb 13 08:29:12.835833 sshd[1597]: pam_unix(sshd:session): session closed for user core
Feb 13 08:29:12.837581 systemd[1]: sshd@5-145.40.67.89:22-139.178.68.195:42260.service: Deactivated successfully.
Feb 13 08:29:12.837996 systemd[1]: session-8.scope: Deactivated successfully.
Feb 13 08:29:12.838370 systemd-logind[1463]: Session 8 logged out. Waiting for processes to exit.
Feb 13 08:29:12.838987 systemd[1]: Started sshd@6-145.40.67.89:22-139.178.68.195:42274.service.
Feb 13 08:29:12.839510 systemd-logind[1463]: Removed session 8.
Feb 13 08:29:12.821000 audit[1604]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc81aafbe0 a2=420 a3=0 items=0 ppid=1 pid=1604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:12.844464 kernel: audit: type=1300 audit(1707812952.821:172): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc81aafbe0 a2=420 a3=0 items=0 ppid=1 pid=1604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:12.821000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44
Feb 13 08:29:12.884751 kernel: audit: type=1327 audit(1707812952.821:172): proctitle=2F7362696E2F617564697463746C002D44
Feb 13 08:29:12.884778 kernel: audit: type=1131 audit(1707812952.822:173): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.822000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.907232 kernel: audit: type=1130 audit(1707812952.833:174): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.913271 sshd[1627]: Accepted publickey for core from 139.178.68.195 port 42274 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:29:12.914834 sshd[1627]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:29:12.917011 systemd-logind[1463]: New session 9 of user core.
Feb 13 08:29:12.917731 systemd[1]: Started session-9.scope.
Feb 13 08:29:12.929716 kernel: audit: type=1106 audit(1707812952.833:175): pid=1600 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.833000 audit[1600]: USER_END pid=1600 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.955725 kernel: audit: type=1104 audit(1707812952.833:176): pid=1600 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.833000 audit[1600]: CRED_DISP pid=1600 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.965102 sudo[1630]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Feb 13 08:29:12.965207 sudo[1630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Feb 13 08:29:12.834000 audit[1597]: USER_END pid=1597 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:13.011431 kernel: audit: type=1106 audit(1707812952.834:177): pid=1597 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:13.011458 kernel: audit: type=1104 audit(1707812952.835:178): pid=1597 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:12.835000 audit[1597]: CRED_DISP pid=1597 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:12.836000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-145.40.67.89:22-139.178.68.195:42260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:13.062777 kernel: audit: type=1131 audit(1707812952.836:179): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-145.40.67.89:22-139.178.68.195:42260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.837000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.67.89:22-139.178.68.195:42274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.912000 audit[1627]: USER_ACCT pid=1627 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:12.914000 audit[1627]: CRED_ACQ pid=1627 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:12.914000 audit[1627]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeea251bb0 a2=3 a3=0 items=0 ppid=1 pid=1627 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:12.914000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:29:12.918000 audit[1627]: USER_START pid=1627 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:12.919000 audit[1629]: CRED_ACQ pid=1629 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:29:12.963000 audit[1630]: USER_ACCT pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.963000 audit[1630]: CRED_REFR pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:12.964000 audit[1630]: USER_START pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:18.822173 systemd[1]: Starting systemd-networkd-wait-online.service...
Feb 13 08:29:18.826627 systemd[1]: Finished systemd-networkd-wait-online.service.
Feb 13 08:29:18.825000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:18.826837 systemd[1]: Reached target network-online.target.
Feb 13 08:29:18.827574 systemd[1]: Starting docker.service...
Feb 13 08:29:18.832190 kernel: kauditd_printk_skb: 11 callbacks suppressed
Feb 13 08:29:18.832219 kernel: audit: type=1130 audit(1707812958.825:189): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:29:18.846137 env[1651]: time="2024-02-13T08:29:18.846080407Z" level=info msg="Starting up"
Feb 13 08:29:18.846894 env[1651]: time="2024-02-13T08:29:18.846851382Z" level=info msg="parsed scheme: \"unix\"" module=grpc
Feb 13 08:29:18.846894 env[1651]: time="2024-02-13T08:29:18.846876505Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc
Feb 13 08:29:18.846894 env[1651]: time="2024-02-13T08:29:18.846887102Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc
Feb 13 08:29:18.846894 env[1651]: time="2024-02-13T08:29:18.846892397Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc
Feb 13 08:29:18.848398 env[1651]: time="2024-02-13T08:29:18.848387903Z" level=info msg="parsed scheme: \"unix\"" module=grpc
Feb 13 08:29:18.848398 env[1651]: time="2024-02-13T08:29:18.848396951Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc
Feb 13 08:29:18.848488 env[1651]: time="2024-02-13T08:29:18.848411554Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc
Feb 13 08:29:18.848488 env[1651]: time="2024-02-13T08:29:18.848434100Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc
Feb 13 08:29:18.886571 env[1651]: time="2024-02-13T08:29:18.886529935Z" level=info msg="Loading containers: start."
Feb 13 08:29:18.916000 audit[1696]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1696 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:18.916000 audit[1696]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffcc80d2760 a2=0 a3=7ffcc80d274c items=0 ppid=1651 pid=1696 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.034273 kernel: audit: type=1325 audit(1707812958.916:190): table=nat:2 family=2 entries=2 op=nft_register_chain pid=1696 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.034310 kernel: audit: type=1300 audit(1707812958.916:190): arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffcc80d2760 a2=0 a3=7ffcc80d274c items=0 ppid=1651 pid=1696 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.034326 kernel: audit: type=1327 audit(1707812958.916:190): proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Feb 13 08:29:18.916000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Feb 13 08:29:19.083653 kernel: audit: type=1325 audit(1707812958.917:191): table=filter:3 family=2 entries=2 op=nft_register_chain pid=1698 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:18.917000 audit[1698]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1698 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.136043 kernel: audit: type=1300 audit(1707812958.917:191): arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffd43407be0 a2=0 a3=7ffd43407bcc items=0 ppid=1651 pid=1698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:18.917000 audit[1698]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffd43407be0 a2=0 a3=7ffd43407bcc items=0 ppid=1651 pid=1698 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.227322 kernel: audit: type=1327 audit(1707812958.917:191): proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Feb 13 08:29:18.917000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Feb 13 08:29:19.282384 kernel: audit: type=1325 audit(1707812958.918:192): table=filter:4 family=2 entries=1 op=nft_register_chain pid=1700 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:18.918000 audit[1700]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1700 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.339771 kernel: audit: type=1300 audit(1707812958.918:192): arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffe8282490 a2=0 a3=7fffe828247c items=0 ppid=1651 pid=1700 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:18.918000 audit[1700]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffe8282490 a2=0 a3=7fffe828247c items=0 ppid=1651 pid=1700 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.436450 kernel: audit: type=1327 audit(1707812958.918:192): 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 08:29:18.918000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 08:29:18.918000 audit[1702]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1702 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:18.918000 audit[1702]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffb7abe320 a2=0 a3=7fffb7abe30c items=0 ppid=1651 pid=1702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:18.918000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 08:29:18.920000 audit[1704]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1704 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:18.920000 audit[1704]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffc26631a0 a2=0 a3=7fffc266318c items=0 ppid=1651 pid=1704 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:18.920000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Feb 13 08:29:19.507000 audit[1709]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1709 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.507000 audit[1709]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffca4270e90 a2=0 
a3=7ffca4270e7c items=0 ppid=1651 pid=1709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.507000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Feb 13 08:29:19.510000 audit[1711]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1711 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.510000 audit[1711]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe8cc3a890 a2=0 a3=7ffe8cc3a87c items=0 ppid=1651 pid=1711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.510000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Feb 13 08:29:19.511000 audit[1713]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1713 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.511000 audit[1713]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc28a00f20 a2=0 a3=7ffc28a00f0c items=0 ppid=1651 pid=1713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.511000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Feb 13 08:29:19.512000 audit[1715]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1715 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.512000 audit[1715]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=308 a0=3 a1=7fffb7145920 a2=0 a3=7fffb714590c items=0 ppid=1651 pid=1715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.512000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:29:19.515000 audit[1719]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1719 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.515000 audit[1719]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffde2babf80 a2=0 a3=7ffde2babf6c items=0 ppid=1651 pid=1719 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.515000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:29:19.515000 audit[1720]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1720 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.515000 audit[1720]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffee5bd15f0 a2=0 a3=7ffee5bd15dc items=0 ppid=1651 pid=1720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.515000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:29:19.548475 kernel: Initializing XFRM netlink socket Feb 13 08:29:19.569680 env[1651]: time="2024-02-13T08:29:19.569661178Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. 
Daemon option --bip can be used to set a preferred IP address" Feb 13 08:29:19.570364 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. Feb 13 08:29:19.579000 audit[1728]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1728 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.579000 audit[1728]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffc744ae130 a2=0 a3=7ffc744ae11c items=0 ppid=1651 pid=1728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.579000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Feb 13 08:29:19.590000 audit[1731]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1731 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.590000 audit[1731]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7fff99697190 a2=0 a3=7fff9969717c items=0 ppid=1651 pid=1731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.590000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Feb 13 08:29:19.591000 audit[1734]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1734 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.591000 audit[1734]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff6f895080 a2=0 a3=7fff6f89506c items=0 ppid=1651 pid=1734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.591000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Feb 13 08:29:19.592000 audit[1736]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1736 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.592000 audit[1736]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff08378320 a2=0 a3=7fff0837830c items=0 ppid=1651 pid=1736 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.592000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Feb 13 08:29:19.593000 audit[1738]: NETFILTER_CFG table=nat:17 family=2 entries=2 op=nft_register_chain pid=1738 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.593000 audit[1738]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffc75e1b930 a2=0 a3=7ffc75e1b91c items=0 ppid=1651 pid=1738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.593000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Feb 13 08:29:19.594000 audit[1740]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1740 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.594000 audit[1740]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=444 a0=3 a1=7ffe55ce7d00 a2=0 a3=7ffe55ce7cec items=0 ppid=1651 pid=1740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.594000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Feb 13 08:29:19.595000 audit[1742]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1742 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.595000 audit[1742]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7fff32891d10 a2=0 a3=7fff32891cfc items=0 ppid=1651 pid=1742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.595000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Feb 13 08:29:19.600000 audit[1745]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1745 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.600000 audit[1745]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffe80e92d70 a2=0 a3=7ffe80e92d5c items=0 ppid=1651 pid=1745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.600000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Feb 13 08:29:19.602000 audit[1747]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1747 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.602000 audit[1747]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7ffe61bcbed0 a2=0 a3=7ffe61bcbebc items=0 ppid=1651 pid=1747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.602000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Feb 13 08:29:19.603000 audit[1749]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1749 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.603000 audit[1749]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffe21c49020 a2=0 a3=7ffe21c4900c items=0 ppid=1651 pid=1749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.603000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300021002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Feb 13 08:29:19.604000 audit[1751]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1751 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.604000 audit[1751]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe05c80ef0 a2=0
a3=7ffe05c80edc items=0 ppid=1651 pid=1751 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.604000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Feb 13 08:29:19.605904 systemd-networkd[1312]: docker0: Link UP Feb 13 08:29:19.608000 audit[1755]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1755 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.608000 audit[1755]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff1f77abe0 a2=0 a3=7fff1f77abcc items=0 ppid=1651 pid=1755 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.608000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:29:19.608000 audit[1756]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1756 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:19.608000 audit[1756]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd28713d40 a2=0 a3=7ffd28713d2c items=0 ppid=1651 pid=1756 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:19.608000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Feb 13 08:29:19.610508 env[1651]: time="2024-02-13T08:29:19.610456653Z" level=info msg="Loading containers: done." 
Feb 13 08:29:19.618344 env[1651]: time="2024-02-13T08:29:19.618294025Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 08:29:19.618440 env[1651]: time="2024-02-13T08:29:19.618406041Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Feb 13 08:29:19.618481 env[1651]: time="2024-02-13T08:29:19.618472365Z" level=info msg="Daemon has completed initialization" Feb 13 08:29:19.626810 systemd[1]: Started docker.service. Feb 13 08:29:19.625000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:19.632456 env[1651]: time="2024-02-13T08:29:19.632416800Z" level=info msg="API listen on /run/docker.sock" Feb 13 08:29:19.654113 systemd[1]: Reloading. Feb 13 08:29:19.718467 /usr/lib/systemd/system-generators/torcx-generator[1808]: time="2024-02-13T08:29:19Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:29:19.718490 /usr/lib/systemd/system-generators/torcx-generator[1808]: time="2024-02-13T08:29:19Z" level=info msg="torcx already run" Feb 13 08:29:19.284972 systemd-resolved[1418]: Clock change detected. Flushing caches. Feb 13 08:29:19.321854 systemd-journald[1257]: Time jumped backwards, rotating. Feb 13 08:29:19.285051 systemd-timesyncd[1419]: Contacted time server [2604:2dc0:202:300::140d]:123 (2.flatcar.pool.ntp.org). Feb 13 08:29:19.285077 systemd-timesyncd[1419]: Initial clock synchronization to Tue 2024-02-13 08:29:19.284947 UTC. 
Feb 13 08:29:19.299605 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:29:19.299611 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:29:19.311784 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.353000 audit: BPF prog-id=37 op=LOAD Feb 13 08:29:19.353000 audit: BPF prog-id=26 op=UNLOAD Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit: BPF prog-id=38 op=LOAD Feb 13 08:29:19.354000 audit: BPF prog-id=35 op=UNLOAD Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit: BPF prog-id=39 op=LOAD Feb 13 08:29:19.354000 audit: BPF prog-id=32 op=UNLOAD Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit: BPF prog-id=40 op=LOAD Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.354000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit: BPF prog-id=41 op=LOAD Feb 13 08:29:19.355000 audit: BPF prog-id=33 op=UNLOAD Feb 13 08:29:19.355000 audit: BPF prog-id=34 op=UNLOAD Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit: BPF prog-id=42 op=LOAD Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.355000 audit: BPF prog-id=43 op=LOAD Feb 13 08:29:19.355000 audit: BPF prog-id=24 op=UNLOAD Feb 13 08:29:19.355000 audit: BPF prog-id=25 op=UNLOAD Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit: BPF prog-id=44 op=LOAD Feb 13 08:29:19.356000 audit: BPF prog-id=21 op=UNLOAD Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit: BPF prog-id=45 op=LOAD Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.356000 audit: BPF prog-id=46 op=LOAD Feb 13 08:29:19.356000 audit: BPF prog-id=22 op=UNLOAD Feb 13 08:29:19.356000 audit: BPF prog-id=23 op=UNLOAD Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit: BPF prog-id=47 op=LOAD Feb 13 08:29:19.357000 audit: BPF prog-id=30 op=UNLOAD Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit: BPF prog-id=48 op=LOAD Feb 13 08:29:19.357000 audit: BPF prog-id=27 op=UNLOAD Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.357000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit: BPF prog-id=49 op=LOAD Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit: BPF prog-id=50 op=LOAD Feb 13 08:29:19.358000 audit: BPF prog-id=28 op=UNLOAD Feb 13 08:29:19.358000 audit: BPF prog-id=29 op=UNLOAD Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:19.358000 audit: BPF prog-id=51 op=LOAD 
Feb 13 08:29:19.358000 audit: BPF prog-id=31 op=UNLOAD Feb 13 08:29:19.364756 systemd[1]: Started kubelet.service. Feb 13 08:29:19.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:19.387502 kubelet[1868]: E0213 08:29:19.387445 1868 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set" Feb 13 08:29:19.389243 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 08:29:19.389314 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 08:29:19.387000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 08:29:19.979813 env[1475]: time="2024-02-13T08:29:19.979687012Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\"" Feb 13 08:29:20.673739 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3723132258.mount: Deactivated successfully. 
Feb 13 08:29:21.914516 env[1475]: time="2024-02-13T08:29:21.914479758Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:21.915237 env[1475]: time="2024-02-13T08:29:21.915225132Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:21.916206 env[1475]: time="2024-02-13T08:29:21.916194043Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:21.917102 env[1475]: time="2024-02-13T08:29:21.917087864Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:2f28bed4096abd572a56595ac0304238bdc271dcfe22c650707c09bf97ec16fd,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:21.917570 env[1475]: time="2024-02-13T08:29:21.917511185Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.26.13\" returns image reference \"sha256:84900298406b2df97ade16b73c49c2b73265ded8735ac19a4e20c2a4ad65853f\"" Feb 13 08:29:21.922837 env[1475]: time="2024-02-13T08:29:21.922824533Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\"" Feb 13 08:29:23.536476 env[1475]: time="2024-02-13T08:29:23.536427606Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:23.537091 env[1475]: time="2024-02-13T08:29:23.537049772Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486,Labels:map[string]string{io.cri-containerd.image: 
managed,},XXX_unrecognized:[],}" Feb 13 08:29:23.538091 env[1475]: time="2024-02-13T08:29:23.538044675Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:23.539434 env[1475]: time="2024-02-13T08:29:23.539380220Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:fda420c6c15cdd01c4eba3404f0662fe486a9c7f38fa13c741a21334673841a2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:23.539824 env[1475]: time="2024-02-13T08:29:23.539774852Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.26.13\" returns image reference \"sha256:921f237b560bdb02300f82d3606635d395b20635512fab10f0191cff42079486\"" Feb 13 08:29:23.547931 env[1475]: time="2024-02-13T08:29:23.547900472Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\"" Feb 13 08:29:24.620147 env[1475]: time="2024-02-13T08:29:24.620084765Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:24.620761 env[1475]: time="2024-02-13T08:29:24.620698197Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:24.621789 env[1475]: time="2024-02-13T08:29:24.621754716Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:24.622837 env[1475]: time="2024-02-13T08:29:24.622795183Z" level=info msg="ImageCreate event 
&ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c3c7303ee6d01c8e5a769db28661cf854b55175aa72c67e9b6a7b9d47ac42af3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:24.623248 env[1475]: time="2024-02-13T08:29:24.623209575Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.26.13\" returns image reference \"sha256:4fe82b56f06250b6b7eb3d5a879cd2cfabf41cb3e45b24af6059eadbc3b8026e\"" Feb 13 08:29:24.632670 env[1475]: time="2024-02-13T08:29:24.632624499Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\"" Feb 13 08:29:25.618489 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1730620940.mount: Deactivated successfully. Feb 13 08:29:25.906053 env[1475]: time="2024-02-13T08:29:25.905962304Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:25.906605 env[1475]: time="2024-02-13T08:29:25.906570323Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:25.907192 env[1475]: time="2024-02-13T08:29:25.907147778Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.26.13,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:25.908084 env[1475]: time="2024-02-13T08:29:25.908038645Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:f6e0de32a002b910b9b2e0e8d769e2d7b05208240559c745ce4781082ab15f22,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:25.908298 env[1475]: time="2024-02-13T08:29:25.908263127Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.26.13\" returns image reference 
\"sha256:5a7325fa2b6e8d712e4a770abb4a5a5852e87b6de8df34552d67853e9bfb9f9f\"" Feb 13 08:29:25.914456 env[1475]: time="2024-02-13T08:29:25.914438328Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Feb 13 08:29:26.532629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4271291.mount: Deactivated successfully. Feb 13 08:29:26.534176 env[1475]: time="2024-02-13T08:29:26.534137353Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:26.534841 env[1475]: time="2024-02-13T08:29:26.534801200Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:26.535493 env[1475]: time="2024-02-13T08:29:26.535451425Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:26.536227 env[1475]: time="2024-02-13T08:29:26.536188058Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:26.536514 env[1475]: time="2024-02-13T08:29:26.536470690Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Feb 13 08:29:26.542153 env[1475]: time="2024-02-13T08:29:26.542136587Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\"" Feb 13 08:29:27.233580 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3717927626.mount: Deactivated successfully. Feb 13 08:29:29.415100 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Feb 13 08:29:29.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:29.415234 systemd[1]: Stopped kubelet.service. Feb 13 08:29:29.416125 systemd[1]: Started kubelet.service. Feb 13 08:29:29.441029 kubelet[1962]: E0213 08:29:29.441003 1962 run.go:74] "command failed" err="failed to validate kubelet flags: the container runtime endpoint address was not specified or empty, use --container-runtime-endpoint to set" Feb 13 08:29:29.442417 kernel: kauditd_printk_skb: 238 callbacks suppressed Feb 13 08:29:29.442435 kernel: audit: type=1130 audit(1707812969.413:389): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:29.443315 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 08:29:29.443384 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 08:29:29.413000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:29.508010 kernel: audit: type=1131 audit(1707812969.413:390): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:29.414000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:29.634577 kernel: audit: type=1130 audit(1707812969.414:391): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:29.634609 kernel: audit: type=1131 audit(1707812969.441:392): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 08:29:29.441000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Feb 13 08:29:30.082629 env[1475]: time="2024-02-13T08:29:30.082579862Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:30.083230 env[1475]: time="2024-02-13T08:29:30.083190314Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:30.084187 env[1475]: time="2024-02-13T08:29:30.084144260Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.6-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:30.084995 env[1475]: time="2024-02-13T08:29:30.084958974Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:dd75ec974b0a2a6f6bb47001ba09207976e625db898d1b16735528c009cb171c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:30.086471 env[1475]: time="2024-02-13T08:29:30.086434753Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.6-0\" returns image reference 
\"sha256:fce326961ae2d51a5f726883fd59d2a8c2ccc3e45d3bb859882db58e422e59e7\"" Feb 13 08:29:30.091782 env[1475]: time="2024-02-13T08:29:30.091767047Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\"" Feb 13 08:29:30.584904 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4052429766.mount: Deactivated successfully. Feb 13 08:29:31.014417 env[1475]: time="2024-02-13T08:29:31.014343143Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:31.015141 env[1475]: time="2024-02-13T08:29:31.015099522Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:31.015919 env[1475]: time="2024-02-13T08:29:31.015873215Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.9.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:31.016997 env[1475]: time="2024-02-13T08:29:31.016917832Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:8e352a029d304ca7431c6507b56800636c321cb52289686a581ab70aaa8a2e2a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:31.017268 env[1475]: time="2024-02-13T08:29:31.017232553Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.9.3\" returns image reference \"sha256:5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a\"" Feb 13 08:29:32.619895 systemd[1]: Stopped kubelet.service. Feb 13 08:29:32.618000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:29:32.627700 systemd[1]: Reloading. Feb 13 08:29:32.656551 /usr/lib/systemd/system-generators/torcx-generator[2120]: time="2024-02-13T08:29:32Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:29:32.656575 /usr/lib/systemd/system-generators/torcx-generator[2120]: time="2024-02-13T08:29:32Z" level=info msg="torcx already run" Feb 13 08:29:32.618000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:32.686029 kernel: audit: type=1130 audit(1707812972.618:393): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:32.686072 kernel: audit: type=1131 audit(1707812972.618:394): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:32.769469 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:29:32.769477 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:29:32.781648 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.951193 kernel: audit: type=1400 audit(1707812972.823:395): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.951261 kernel: audit: type=1400 audit(1707812972.823:396): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.951279 kernel: audit: type=1400 audit(1707812972.823:397): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.013891 kernel: audit: type=1400 audit(1707812972.823:398): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.823000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit: BPF prog-id=52 op=LOAD Feb 13 08:29:32.949000 audit: BPF prog-id=37 op=UNLOAD Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:32.949000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit: BPF prog-id=53 op=LOAD Feb 13 08:29:33.076000 audit: BPF prog-id=38 op=UNLOAD Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:29:33.076000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.076000 audit: BPF prog-id=54 op=LOAD Feb 13 08:29:33.076000 audit: BPF prog-id=39 op=UNLOAD Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 
audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit: BPF prog-id=55 op=LOAD Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.077000 audit: BPF prog-id=56 op=LOAD Feb 13 08:29:33.077000 audit: BPF prog-id=40 op=UNLOAD Feb 13 08:29:33.077000 audit: BPF prog-id=41 op=UNLOAD Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied 
{ perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit: BPF prog-id=57 op=LOAD Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.078000 audit: BPF prog-id=58 op=LOAD Feb 13 08:29:33.078000 audit: BPF prog-id=42 op=UNLOAD Feb 13 08:29:33.078000 audit: BPF prog-id=43 op=UNLOAD Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for 
pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit: BPF prog-id=59 op=LOAD Feb 13 08:29:33.079000 audit: BPF prog-id=44 op=UNLOAD Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 
comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit: BPF prog-id=60 op=LOAD Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit: BPF prog-id=61 op=LOAD Feb 13 08:29:33.079000 audit: BPF prog-id=45 op=UNLOAD Feb 13 08:29:33.079000 audit: BPF prog-id=46 op=UNLOAD Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.079000 audit: BPF prog-id=62 op=LOAD Feb 13 08:29:33.079000 audit: BPF prog-id=47 op=UNLOAD Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:29:33.080000 audit: BPF prog-id=63 op=LOAD Feb 13 08:29:33.080000 audit: BPF prog-id=48 op=UNLOAD Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 
audit: BPF prog-id=64 op=LOAD Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.080000 audit: BPF prog-id=65 op=LOAD Feb 13 08:29:33.080000 audit: BPF 
prog-id=49 op=UNLOAD Feb 13 08:29:33.080000 audit: BPF prog-id=50 op=UNLOAD Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit[1]: AVC avc: denied 
{ bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.081000 audit: BPF prog-id=66 op=LOAD Feb 13 08:29:33.081000 audit: BPF prog-id=51 op=UNLOAD Feb 13 08:29:33.090271 systemd[1]: Started kubelet.service. Feb 13 08:29:33.088000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:33.113332 kubelet[2180]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:29:33.113332 kubelet[2180]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:29:33.113332 kubelet[2180]: I0213 08:29:33.113318 2180 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 08:29:33.114063 kubelet[2180]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:29:33.114063 kubelet[2180]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 13 08:29:33.302821 kubelet[2180]: I0213 08:29:33.302766 2180 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 13 08:29:33.302821 kubelet[2180]: I0213 08:29:33.302775 2180 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 08:29:33.302873 kubelet[2180]: I0213 08:29:33.302869 2180 server.go:836] "Client rotation is on, will bootstrap in background" Feb 13 08:29:33.304544 kubelet[2180]: I0213 08:29:33.304533 2180 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 08:29:33.305288 kubelet[2180]: E0213 08:29:33.305277 2180 certificate_manager.go:471] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://145.40.67.89:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.322563 kubelet[2180]: I0213 08:29:33.322536 2180 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 08:29:33.322694 kubelet[2180]: I0213 08:29:33.322644 2180 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 08:29:33.322714 kubelet[2180]: I0213 08:29:33.322695 2180 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:systemd KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:} {Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 08:29:33.322714 kubelet[2180]: I0213 08:29:33.322706 2180 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 08:29:33.322714 kubelet[2180]: I0213 08:29:33.322713 2180 container_manager_linux.go:308] "Creating device plugin manager" Feb 13 08:29:33.322801 kubelet[2180]: I0213 08:29:33.322761 2180 state_mem.go:36] "Initialized new 
in-memory state store" Feb 13 08:29:33.324495 kubelet[2180]: I0213 08:29:33.324487 2180 kubelet.go:398] "Attempting to sync node with API server" Feb 13 08:29:33.324525 kubelet[2180]: I0213 08:29:33.324498 2180 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 08:29:33.324525 kubelet[2180]: I0213 08:29:33.324509 2180 kubelet.go:297] "Adding apiserver pod source" Feb 13 08:29:33.324525 kubelet[2180]: I0213 08:29:33.324516 2180 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 08:29:33.324886 kubelet[2180]: W0213 08:29:33.324863 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://145.40.67.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-5e41ede811&limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.324886 kubelet[2180]: W0213 08:29:33.324868 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://145.40.67.89:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.324933 kubelet[2180]: E0213 08:29:33.324898 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://145.40.67.89:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.324933 kubelet[2180]: E0213 08:29:33.324899 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://145.40.67.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-5e41ede811&limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.324933 kubelet[2180]: I0213 08:29:33.324899 2180 kuberuntime_manager.go:244] "Container runtime initialized" 
containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 08:29:33.325133 kubelet[2180]: W0213 08:29:33.325126 2180 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Feb 13 08:29:33.325391 kubelet[2180]: I0213 08:29:33.325385 2180 server.go:1186] "Started kubelet" Feb 13 08:29:33.325466 kubelet[2180]: I0213 08:29:33.325455 2180 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 08:29:33.325661 kubelet[2180]: E0213 08:29:33.325648 2180 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 08:29:33.325690 kubelet[2180]: E0213 08:29:33.325668 2180 kubelet.go:1386] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 08:29:33.325708 kubelet[2180]: E0213 08:29:33.325639 2180 event.go:276] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ci-3510.3.2-a-5e41ede811.17b35ed9d1dcd008", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"ci-3510.3.2-a-5e41ede811", UID:"ci-3510.3.2-a-5e41ede811", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"ci-3510.3.2-a-5e41ede811"}, FirstTimestamp:time.Date(2024, time.February, 13, 8, 29, 33, 325373448, time.Local), 
LastTimestamp:time.Date(2024, time.February, 13, 8, 29, 33, 325373448, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'Post "https://145.40.67.89:6443/api/v1/namespaces/default/events": dial tcp 145.40.67.89:6443: connect: connection refused'(may retry after sleeping) Feb 13 08:29:33.324000 audit[2180]: AVC avc: denied { mac_admin } for pid=2180 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.324000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:29:33.324000 audit[2180]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c00149a630 a1=c00149c108 a2=c00149a600 a3=25 items=0 ppid=1 pid=2180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.324000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:29:33.326300 kubelet[2180]: I0213 08:29:33.326179 2180 server.go:451] "Adding debug handlers to kubelet server" Feb 13 08:29:33.326300 kubelet[2180]: I0213 08:29:33.326236 2180 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 13 08:29:33.326300 kubelet[2180]: I0213 08:29:33.326279 2180 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set 
selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 08:29:33.326376 kubelet[2180]: I0213 08:29:33.326327 2180 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 08:29:33.326576 kubelet[2180]: I0213 08:29:33.326563 2180 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Feb 13 08:29:33.326576 kubelet[2180]: I0213 08:29:33.326570 2180 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 13 08:29:33.324000 audit[2180]: AVC avc: denied { mac_admin } for pid=2180 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.324000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:29:33.324000 audit[2180]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000665e60 a1=c00011ba10 a2=c00139c300 a3=25 items=0 ppid=1 pid=2180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.324000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:29:33.326918 kubelet[2180]: E0213 08:29:33.326760 2180 controller.go:146] failed to ensure lease exists, will retry in 200ms, error: Get "https://145.40.67.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-5e41ede811?timeout=10s": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.326918 kubelet[2180]: W0213 08:29:33.326796 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get 
"https://145.40.67.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.326918 kubelet[2180]: E0213 08:29:33.326827 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://145.40.67.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.325000 audit[2204]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=2204 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.325000 audit[2204]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe1bbd9d10 a2=0 a3=7ffe1bbd9cfc items=0 ppid=2180 pid=2204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.325000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 13 08:29:33.326000 audit[2205]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=2205 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.326000 audit[2205]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc19a04670 a2=0 a3=7ffc19a0465c items=0 ppid=2180 pid=2205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.326000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 08:29:33.327000 audit[2207]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=2207 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Feb 13 08:29:33.327000 audit[2207]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff574fccf0 a2=0 a3=7fff574fccdc items=0 ppid=2180 pid=2207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.327000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 08:29:33.329000 audit[2209]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=2209 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.329000 audit[2209]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff97aa6c80 a2=0 a3=7fff97aa6c6c items=0 ppid=2180 pid=2209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.329000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Feb 13 08:29:33.331000 audit[2212]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2212 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.331000 audit[2212]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffd9efac410 a2=0 a3=7ffd9efac3fc items=0 ppid=2180 pid=2212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.331000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Feb 13 08:29:33.331000 audit[2213]: NETFILTER_CFG table=nat:31 family=2 entries=1 op=nft_register_chain pid=2213 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.331000 audit[2213]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffff3b4bee0 a2=0 a3=7ffff3b4becc items=0 ppid=2180 pid=2213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.331000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 13 08:29:33.334000 audit[2217]: NETFILTER_CFG table=nat:32 family=2 entries=1 op=nft_register_rule pid=2217 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.334000 audit[2217]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffcc7b15430 a2=0 a3=7ffcc7b1541c items=0 ppid=2180 pid=2217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.334000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 13 08:29:33.336000 audit[2220]: NETFILTER_CFG table=filter:33 family=2 entries=1 op=nft_register_rule pid=2220 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.336000 audit[2220]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7ffe7b417690 a2=0 a3=7ffe7b41767c items=0 ppid=2180 pid=2220 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.336000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 13 08:29:33.337000 audit[2221]: NETFILTER_CFG table=nat:34 family=2 entries=1 op=nft_register_chain pid=2221 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.337000 audit[2221]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcaafba4e0 a2=0 a3=7ffcaafba4cc items=0 ppid=2180 pid=2221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.337000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 13 08:29:33.337000 audit[2222]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_chain pid=2222 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.337000 audit[2222]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff6e03b7b0 a2=0 a3=7fff6e03b79c items=0 ppid=2180 pid=2222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.337000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 08:29:33.338000 audit[2224]: NETFILTER_CFG table=nat:36 family=2 entries=1 op=nft_register_rule pid=2224 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.338000 
audit[2224]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd21cc1540 a2=0 a3=7ffd21cc152c items=0 ppid=2180 pid=2224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.338000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 13 08:29:33.339000 audit[2226]: NETFILTER_CFG table=nat:37 family=2 entries=1 op=nft_register_rule pid=2226 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.339000 audit[2226]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fffc3838ce0 a2=0 a3=7fffc3838ccc items=0 ppid=2180 pid=2226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.339000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 08:29:33.340000 audit[2228]: NETFILTER_CFG table=nat:38 family=2 entries=1 op=nft_register_rule pid=2228 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.340000 audit[2228]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7fffb156a010 a2=0 a3=7fffb1569ffc items=0 ppid=2180 pid=2228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.340000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 13 08:29:33.343000 audit[2230]: NETFILTER_CFG table=nat:39 family=2 entries=1 op=nft_register_rule pid=2230 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.343000 audit[2230]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7fff60c2c590 a2=0 a3=7fff60c2c57c items=0 ppid=2180 pid=2230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.343000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 13 08:29:33.343000 audit[2232]: NETFILTER_CFG table=nat:40 family=2 entries=1 op=nft_register_rule pid=2232 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.343000 audit[2232]: SYSCALL arch=c000003e syscall=46 success=yes exit=540 a0=3 a1=7ffc05c934b0 a2=0 a3=7ffc05c9349c items=0 ppid=2180 pid=2232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.343000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 13 08:29:33.345387 kubelet[2180]: I0213 08:29:33.345366 2180 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv4 Feb 13 08:29:33.343000 audit[2233]: NETFILTER_CFG table=mangle:41 family=10 entries=2 op=nft_register_chain pid=2233 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.343000 audit[2233]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffc94f70300 a2=0 a3=7ffc94f702ec items=0 ppid=2180 pid=2233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.343000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Feb 13 08:29:33.343000 audit[2234]: NETFILTER_CFG table=mangle:42 family=2 entries=1 op=nft_register_chain pid=2234 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.343000 audit[2234]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffca0fb8070 a2=0 a3=7ffca0fb805c items=0 ppid=2180 pid=2234 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.343000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 13 08:29:33.344000 audit[2235]: NETFILTER_CFG table=nat:43 family=10 entries=2 op=nft_register_chain pid=2235 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.344000 audit[2235]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe4d22b4f0 a2=0 a3=7ffe4d22b4dc items=0 ppid=2180 pid=2235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.344000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D44524F50002D74006E6174 Feb 13 08:29:33.344000 audit[2236]: NETFILTER_CFG table=nat:44 family=2 entries=1 op=nft_register_chain pid=2236 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.344000 audit[2236]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcc7da4000 a2=0 a3=7ffcc7da3fec items=0 ppid=2180 pid=2236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.344000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 13 08:29:33.344000 audit[2238]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_chain pid=2238 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:33.344000 audit[2238]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffef2c41560 a2=0 a3=7ffef2c4154c items=0 ppid=2180 pid=2238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.344000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 13 08:29:33.345000 audit[2239]: NETFILTER_CFG table=nat:46 family=10 entries=1 op=nft_register_rule pid=2239 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.345000 audit[2239]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffe35ab6720 a2=0 a3=7ffe35ab670c items=0 ppid=2180 pid=2239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 
08:29:33.345000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D44524F50002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303038303030 Feb 13 08:29:33.345000 audit[2240]: NETFILTER_CFG table=filter:47 family=10 entries=2 op=nft_register_chain pid=2240 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.345000 audit[2240]: SYSCALL arch=c000003e syscall=46 success=yes exit=132 a0=3 a1=7ffe92a2b0f0 a2=0 a3=7ffe92a2b0dc items=0 ppid=2180 pid=2240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.345000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Feb 13 08:29:33.347000 audit[2242]: NETFILTER_CFG table=filter:48 family=10 entries=1 op=nft_register_rule pid=2242 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.347000 audit[2242]: SYSCALL arch=c000003e syscall=46 success=yes exit=664 a0=3 a1=7ffd9a4fea80 a2=0 a3=7ffd9a4fea6c items=0 ppid=2180 pid=2242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.347000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206669726577616C6C20666F722064726F7070696E67206D61726B6564207061636B657473002D6D006D61726B Feb 13 08:29:33.347000 audit[2243]: NETFILTER_CFG table=nat:49 family=10 entries=1 op=nft_register_chain pid=2243 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.347000 audit[2243]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff9286c060 a2=0 
a3=7fff9286c04c items=0 ppid=2180 pid=2243 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.347000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4D41524B2D4D415351002D74006E6174 Feb 13 08:29:33.347000 audit[2244]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2244 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.347000 audit[2244]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdfea37f80 a2=0 a3=7ffdfea37f6c items=0 ppid=2180 pid=2244 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.347000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Feb 13 08:29:33.349000 audit[2246]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_rule pid=2246 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.349000 audit[2246]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffef9ce5770 a2=0 a3=7ffef9ce575c items=0 ppid=2180 pid=2246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.349000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D4D41524B2D4D415351002D74006E6174002D6A004D41524B002D2D6F722D6D61726B0030783030303034303030 Feb 13 08:29:33.349000 audit[2248]: NETFILTER_CFG table=nat:52 family=10 entries=2 op=nft_register_chain pid=2248 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.349000 audit[2248]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffdb9e7b200 a2=0 a3=7ffdb9e7b1ec items=0 ppid=2180 pid=2248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.349000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Feb 13 08:29:33.350000 audit[2250]: NETFILTER_CFG table=nat:53 family=10 entries=1 op=nft_register_rule pid=2250 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.350000 audit[2250]: SYSCALL arch=c000003e syscall=46 success=yes exit=364 a0=3 a1=7ffc4b7cf810 a2=0 a3=7ffc4b7cf7fc items=0 ppid=2180 pid=2250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.350000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D006D61726B0000002D2D6D61726B00307830303030343030302F30783030303034303030002D6A0052455455524E Feb 13 08:29:33.353000 audit[2252]: NETFILTER_CFG table=nat:54 family=10 entries=1 op=nft_register_rule pid=2252 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.353000 audit[2252]: SYSCALL arch=c000003e syscall=46 success=yes exit=220 a0=3 a1=7ffd4505f010 a2=0 a3=7ffd4505effc items=0 ppid=2180 pid=2252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.353000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6A004D41524B002D2D786F722D6D61726B0030783030303034303030 Feb 13 08:29:33.354000 audit[2254]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_rule pid=2254 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.354000 audit[2254]: SYSCALL arch=c000003e syscall=46 success=yes exit=556 a0=3 a1=7ffd828c47d0 a2=0 a3=7ffd828c47bc items=0 ppid=2180 pid=2254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.354000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D41004B5542452D504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732073657276696365207472616666696320726571756972696E6720534E4154002D6A004D415351554552414445 Feb 13 08:29:33.356122 kubelet[2180]: I0213 08:29:33.356009 2180 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv6 Feb 13 08:29:33.356122 kubelet[2180]: I0213 08:29:33.356021 2180 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 13 08:29:33.356122 kubelet[2180]: I0213 08:29:33.356035 2180 kubelet.go:2113] "Starting kubelet main sync loop" Feb 13 08:29:33.356122 kubelet[2180]: E0213 08:29:33.356066 2180 kubelet.go:2137] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 08:29:33.356289 kubelet[2180]: W0213 08:29:33.356274 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://145.40.67.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.356312 kubelet[2180]: E0213 08:29:33.356298 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://145.40.67.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:33.355000 audit[2255]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=2255 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.355000 audit[2255]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff7c8d9b20 a2=0 a3=7fff7c8d9b0c items=0 ppid=2180 pid=2255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.355000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Feb 13 08:29:33.355000 audit[2256]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=2256 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.355000 audit[2256]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff4d1a7370 a2=0 a3=7fff4d1a735c items=0 ppid=2180 pid=2256 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.355000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Feb 13 08:29:33.355000 audit[2257]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=2257 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:33.355000 audit[2257]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffec788c440 a2=0 a3=7ffec788c42c items=0 ppid=2180 pid=2257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.355000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Feb 13 08:29:33.382534 kubelet[2180]: I0213 08:29:33.382492 2180 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 08:29:33.382534 kubelet[2180]: I0213 08:29:33.382507 2180 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 08:29:33.382534 kubelet[2180]: I0213 08:29:33.382518 2180 state_mem.go:36] "Initialized new in-memory state store" Feb 13 08:29:33.383460 kubelet[2180]: I0213 08:29:33.383424 2180 policy_none.go:49] "None policy: Start" Feb 13 08:29:33.383829 kubelet[2180]: I0213 08:29:33.383790 2180 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 08:29:33.383829 kubelet[2180]: I0213 08:29:33.383804 2180 state_mem.go:35] "Initializing new in-memory state store" Feb 13 08:29:33.387309 
systemd[1]: Created slice kubepods.slice. Feb 13 08:29:33.390872 systemd[1]: Created slice kubepods-burstable.slice. Feb 13 08:29:33.393331 systemd[1]: Created slice kubepods-besteffort.slice. Feb 13 08:29:33.403871 kubelet[2180]: I0213 08:29:33.403815 2180 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 08:29:33.402000 audit[2180]: AVC avc: denied { mac_admin } for pid=2180 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:33.402000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:29:33.402000 audit[2180]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c00161fb00 a1=c001055d10 a2=c00161fad0 a3=25 items=0 ppid=1 pid=2180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:33.402000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:29:33.404268 kubelet[2180]: I0213 08:29:33.403884 2180 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 08:29:33.404268 kubelet[2180]: I0213 08:29:33.404091 2180 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 08:29:33.404541 kubelet[2180]: E0213 08:29:33.404482 2180 eviction_manager.go:261] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:33.430383 kubelet[2180]: I0213 08:29:33.430291 2180 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:33.430992 kubelet[2180]: E0213 08:29:33.430925 2180 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.89:6443/api/v1/nodes\": dial tcp 145.40.67.89:6443: connect: connection refused" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:33.457203 kubelet[2180]: I0213 08:29:33.457100 2180 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:33.460498 kubelet[2180]: I0213 08:29:33.460419 2180 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:33.463612 kubelet[2180]: I0213 08:29:33.463564 2180 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:33.464051 kubelet[2180]: I0213 08:29:33.463980 2180 status_manager.go:698] "Failed to get status for pod" podUID=15bd00ad85baf2935512d2bdf5998a02 pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" err="Get \"https://145.40.67.89:6443/api/v1/namespaces/kube-system/pods/kube-apiserver-ci-3510.3.2-a-5e41ede811\": dial tcp 145.40.67.89:6443: connect: connection refused" Feb 13 08:29:33.469894 kubelet[2180]: I0213 08:29:33.469831 2180 status_manager.go:698] "Failed to get status for pod" podUID=da9509f303d934bf8e6e4f732dc10d6f pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" err="Get 
\"https://145.40.67.89:6443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ci-3510.3.2-a-5e41ede811\": dial tcp 145.40.67.89:6443: connect: connection refused"
Feb 13 08:29:33.473481 kubelet[2180]: I0213 08:29:33.473443 2180 status_manager.go:698] "Failed to get status for pod" podUID=85d9a141d1ad4632795a3a88f1cc049e pod="kube-system/kube-scheduler-ci-3510.3.2-a-5e41ede811" err="Get \"https://145.40.67.89:6443/api/v1/namespaces/kube-system/pods/kube-scheduler-ci-3510.3.2-a-5e41ede811\": dial tcp 145.40.67.89:6443: connect: connection refused"
Feb 13 08:29:33.479296 systemd[1]: Created slice kubepods-burstable-pod15bd00ad85baf2935512d2bdf5998a02.slice.
Feb 13 08:29:33.518662 systemd[1]: Created slice kubepods-burstable-podda9509f303d934bf8e6e4f732dc10d6f.slice.
Feb 13 08:29:33.527023 systemd[1]: Created slice kubepods-burstable-pod85d9a141d1ad4632795a3a88f1cc049e.slice.
Feb 13 08:29:33.527613 kubelet[2180]: E0213 08:29:33.527507 2180 controller.go:146] failed to ensure lease exists, will retry in 400ms, error: Get "https://145.40.67.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-5e41ede811?timeout=10s": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:33.627789 kubelet[2180]: I0213 08:29:33.627592 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/15bd00ad85baf2935512d2bdf5998a02-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" (UID: \"15bd00ad85baf2935512d2bdf5998a02\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.627789 kubelet[2180]: I0213 08:29:33.627695 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.627789 kubelet[2180]: I0213 08:29:33.627762 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.628218 kubelet[2180]: I0213 08:29:33.627852 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.628218 kubelet[2180]: I0213 08:29:33.627994 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.628218 kubelet[2180]: I0213 08:29:33.628107 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/85d9a141d1ad4632795a3a88f1cc049e-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-5e41ede811\" (UID: \"85d9a141d1ad4632795a3a88f1cc049e\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.628218 kubelet[2180]: I0213 08:29:33.628205 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/15bd00ad85baf2935512d2bdf5998a02-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" (UID: \"15bd00ad85baf2935512d2bdf5998a02\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.628660 kubelet[2180]: I0213 08:29:33.628315 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/15bd00ad85baf2935512d2bdf5998a02-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" (UID: \"15bd00ad85baf2935512d2bdf5998a02\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.628660 kubelet[2180]: I0213 08:29:33.628453 2180 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.633457 kubelet[2180]: I0213 08:29:33.633426 2180 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.633671 kubelet[2180]: E0213 08:29:33.633630 2180 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.89:6443/api/v1/nodes\": dial tcp 145.40.67.89:6443: connect: connection refused" node="ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:33.812974 env[1475]: time="2024-02-13T08:29:33.812812848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-5e41ede811,Uid:15bd00ad85baf2935512d2bdf5998a02,Namespace:kube-system,Attempt:0,}"
Feb 13 08:29:33.824995 env[1475]: time="2024-02-13T08:29:33.824884593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-5e41ede811,Uid:da9509f303d934bf8e6e4f732dc10d6f,Namespace:kube-system,Attempt:0,}"
Feb 13 08:29:33.832763 env[1475]: time="2024-02-13T08:29:33.832637160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-5e41ede811,Uid:85d9a141d1ad4632795a3a88f1cc049e,Namespace:kube-system,Attempt:0,}"
Feb 13 08:29:33.929106 kubelet[2180]: E0213 08:29:33.929009 2180 controller.go:146] failed to ensure lease exists, will retry in 800ms, error: Get "https://145.40.67.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-5e41ede811?timeout=10s": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.037895 kubelet[2180]: I0213 08:29:34.037833 2180 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:34.038562 kubelet[2180]: E0213 08:29:34.038515 2180 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.89:6443/api/v1/nodes\": dial tcp 145.40.67.89:6443: connect: connection refused" node="ci-3510.3.2-a-5e41ede811"
Feb 13 08:29:34.136520 kubelet[2180]: W0213 08:29:34.136369 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://145.40.67.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.136520 kubelet[2180]: E0213 08:29:34.136491 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://145.40.67.89:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.216539 kubelet[2180]: W0213 08:29:34.216307 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: Get "https://145.40.67.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-5e41ede811&limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.216539 kubelet[2180]: E0213 08:29:34.216432 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://145.40.67.89:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.2-a-5e41ede811&limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.335223 kubelet[2180]: W0213 08:29:34.335109 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: Get "https://145.40.67.89:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.335223 kubelet[2180]: E0213 08:29:34.335208 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://145.40.67.89:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused
Feb 13 08:29:34.342235 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3330693740.mount: Deactivated successfully.
Feb 13 08:29:34.342538 env[1475]: time="2024-02-13T08:29:34.342307149Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.344994 env[1475]: time="2024-02-13T08:29:34.344921380Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.346178 env[1475]: time="2024-02-13T08:29:34.346123518Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.348259 env[1475]: time="2024-02-13T08:29:34.348198690Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.351489 env[1475]: time="2024-02-13T08:29:34.351437545Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.354472 env[1475]: time="2024-02-13T08:29:34.354424450Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.356711 env[1475]: time="2024-02-13T08:29:34.356667427Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.357278 env[1475]: time="2024-02-13T08:29:34.357237367Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.357850 env[1475]: time="2024-02-13T08:29:34.357830842Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.358451 env[1475]: time="2024-02-13T08:29:34.358409199Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.359053 env[1475]: time="2024-02-13T08:29:34.359009113Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.359661 env[1475]: time="2024-02-13T08:29:34.359618217Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Feb 13 08:29:34.363396 env[1475]: time="2024-02-13T08:29:34.363348743Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 08:29:34.363396 env[1475]: time="2024-02-13T08:29:34.363381889Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 08:29:34.363396 env[1475]: time="2024-02-13T08:29:34.363391679Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 08:29:34.363538 env[1475]: time="2024-02-13T08:29:34.363499314Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b7a66e9a1b86daab9a1e8f848b8d748407ec4a631dbded66215f2da9fa6abe02 pid=2272 runtime=io.containerd.runc.v2
Feb 13 08:29:34.364348 env[1475]: time="2024-02-13T08:29:34.364313743Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 08:29:34.364348 env[1475]: time="2024-02-13T08:29:34.364340676Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 08:29:34.364412 env[1475]: time="2024-02-13T08:29:34.364350409Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 08:29:34.364461 env[1475]: time="2024-02-13T08:29:34.364438012Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/a3791c3f47c06d7df8c1b00c0ce990ae49b894d51ecb086ae9a97cfe66853498 pid=2283 runtime=io.containerd.runc.v2
Feb 13 08:29:34.364966 env[1475]: time="2024-02-13T08:29:34.364940413Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 08:29:34.364966 env[1475]: time="2024-02-13T08:29:34.364960560Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 08:29:34.365034 env[1475]: time="2024-02-13T08:29:34.364969377Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 08:29:34.365056 env[1475]: time="2024-02-13T08:29:34.365039074Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/0e670da51d4102e2f793503c79757bcf8792bb48829d79823bc13bbb6cfb76d3 pid=2294 runtime=io.containerd.runc.v2
Feb 13 08:29:34.373438 systemd[1]: Started cri-containerd-0e670da51d4102e2f793503c79757bcf8792bb48829d79823bc13bbb6cfb76d3.scope.
Feb 13 08:29:34.374022 systemd[1]: Started cri-containerd-a3791c3f47c06d7df8c1b00c0ce990ae49b894d51ecb086ae9a97cfe66853498.scope.
Feb 13 08:29:34.374636 systemd[1]: Started cri-containerd-b7a66e9a1b86daab9a1e8f848b8d748407ec4a631dbded66215f2da9fa6abe02.scope.
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit: BPF prog-id=67 op=LOAD
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c0001bdc48 a2=10 a3=1c items=0 ppid=2294 pid=2316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065363730646135316434313032653266373933353033633739373537
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001bd6b0 a2=3c a3=c items=0 ppid=2294 pid=2316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065363730646135316434313032653266373933353033633739373537
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit: BPF prog-id=68 op=LOAD
Feb 13 08:29:34.377000 audit[2316]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bd9d8 a2=78 a3=c00009ca90 items=0 ppid=2294 pid=2316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065363730646135316434313032653266373933353033633739373537
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit: BPF prog-id=69 op=LOAD
Feb 13 08:29:34.377000 audit[2316]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c0001bd770 a2=78 a3=c00009cad8 items=0 ppid=2294 pid=2316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065363730646135316434313032653266373933353033633739373537
Feb 13 08:29:34.377000 audit: BPF prog-id=69 op=UNLOAD
Feb 13 08:29:34.377000 audit: BPF prog-id=68 op=UNLOAD
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { perfmon } for pid=2316 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[2316]: AVC avc: denied { bpf } for pid=2316 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit: BPF prog-id=70 op=LOAD
Feb 13 08:29:34.377000 audit[2316]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bdc30 a2=78 a3=c00009cee8 items=0 ppid=2294 pid=2316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3065363730646135316434313032653266373933353033633739373537
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.377000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit: BPF prog-id=71 op=LOAD
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000117c48 a2=10 a3=1c items=0 ppid=2283 pid=2309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373931633366343763303664376466386331623030633063653939
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001176b0 a2=3c a3=c items=0 ppid=2283 pid=2309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373931633366343763303664376466386331623030633063653939
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit: BPF prog-id=72 op=LOAD
Feb 13 08:29:34.378000 audit[2309]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001179d8 a2=78 a3=c00019e6a0 items=0 ppid=2283 pid=2309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373931633366343763303664376466386331623030633063653939
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit: BPF prog-id=73 op=LOAD
Feb 13 08:29:34.378000 audit[2309]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000117770 a2=78 a3=c00019e6e8 items=0 ppid=2283 pid=2309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373931633366343763303664376466386331623030633063653939
Feb 13 08:29:34.378000 audit: BPF prog-id=73 op=UNLOAD
Feb 13 08:29:34.378000 audit: BPF prog-id=72 op=UNLOAD
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { perfmon } for pid=2309 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[2309]: AVC avc: denied { bpf } for pid=2309 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit: BPF prog-id=74 op=LOAD
Feb 13 08:29:34.378000 audit[2309]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000117c30 a2=78 a3=c00019eaf8 items=0 ppid=2283 pid=2309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:29:34.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373931633366343763303664376466386331623030633063653939
Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0
Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { bpf } for pid=1
comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.378000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit: BPF prog-id=75 op=LOAD Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000195c48 a2=10 a3=1c items=0 
ppid=2272 pid=2300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.379000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237613636653961316238366461616239613165386638343862386437 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001956b0 a2=3c a3=8 items=0 ppid=2272 pid=2300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.379000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237613636653961316238366461616239613165386638343862386437 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit: BPF prog-id=76 op=LOAD Feb 13 08:29:34.379000 audit[2300]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001959d8 a2=78 a3=c0000a54b0 items=0 ppid=2272 pid=2300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.379000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237613636653961316238366461616239613165386638343862386437 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC 
avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit: BPF prog-id=77 op=LOAD Feb 13 08:29:34.379000 audit[2300]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000195770 a2=78 a3=c0000a54f8 items=0 ppid=2272 pid=2300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.379000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237613636653961316238366461616239613165386638343862386437 Feb 13 08:29:34.379000 audit: BPF prog-id=77 op=UNLOAD Feb 13 08:29:34.379000 audit: BPF prog-id=76 op=UNLOAD Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { perfmon } for pid=2300 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit[2300]: AVC avc: denied { bpf } for pid=2300 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.379000 audit: BPF prog-id=78 op=LOAD Feb 13 08:29:34.379000 audit[2300]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000195c30 a2=78 a3=c0000a5908 items=0 ppid=2272 pid=2300 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.379000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237613636653961316238366461616239613165386638343862386437 Feb 13 08:29:34.397432 env[1475]: time="2024-02-13T08:29:34.397403326Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.2-a-5e41ede811,Uid:da9509f303d934bf8e6e4f732dc10d6f,Namespace:kube-system,Attempt:0,} returns sandbox id \"0e670da51d4102e2f793503c79757bcf8792bb48829d79823bc13bbb6cfb76d3\"" Feb 13 08:29:34.397931 env[1475]: time="2024-02-13T08:29:34.397912509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.2-a-5e41ede811,Uid:85d9a141d1ad4632795a3a88f1cc049e,Namespace:kube-system,Attempt:0,} returns sandbox id \"a3791c3f47c06d7df8c1b00c0ce990ae49b894d51ecb086ae9a97cfe66853498\"" Feb 13 08:29:34.398791 env[1475]: time="2024-02-13T08:29:34.398777056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.2-a-5e41ede811,Uid:15bd00ad85baf2935512d2bdf5998a02,Namespace:kube-system,Attempt:0,} returns sandbox id \"b7a66e9a1b86daab9a1e8f848b8d748407ec4a631dbded66215f2da9fa6abe02\"" Feb 13 08:29:34.399622 env[1475]: time="2024-02-13T08:29:34.399610484Z" level=info msg="CreateContainer within sandbox \"0e670da51d4102e2f793503c79757bcf8792bb48829d79823bc13bbb6cfb76d3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 13 08:29:34.399675 env[1475]: time="2024-02-13T08:29:34.399661881Z" level=info msg="CreateContainer within sandbox \"a3791c3f47c06d7df8c1b00c0ce990ae49b894d51ecb086ae9a97cfe66853498\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 13 08:29:34.400281 env[1475]: time="2024-02-13T08:29:34.400270111Z" level=info msg="CreateContainer within sandbox \"b7a66e9a1b86daab9a1e8f848b8d748407ec4a631dbded66215f2da9fa6abe02\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 13 08:29:34.406281 env[1475]: time="2024-02-13T08:29:34.406236928Z" level=info msg="CreateContainer within sandbox \"a3791c3f47c06d7df8c1b00c0ce990ae49b894d51ecb086ae9a97cfe66853498\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id 
\"3a585599a2352804d55c68396dabfa4619ac713c50830c95d5b78c2dd7d16f8a\"" Feb 13 08:29:34.406572 env[1475]: time="2024-02-13T08:29:34.406527593Z" level=info msg="StartContainer for \"3a585599a2352804d55c68396dabfa4619ac713c50830c95d5b78c2dd7d16f8a\"" Feb 13 08:29:34.407856 env[1475]: time="2024-02-13T08:29:34.407811518Z" level=info msg="CreateContainer within sandbox \"0e670da51d4102e2f793503c79757bcf8792bb48829d79823bc13bbb6cfb76d3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"45d0bfbd2c7ea136636809c907327b4cb4539c19867520608208227122b8d74a\"" Feb 13 08:29:34.407988 env[1475]: time="2024-02-13T08:29:34.407952535Z" level=info msg="StartContainer for \"45d0bfbd2c7ea136636809c907327b4cb4539c19867520608208227122b8d74a\"" Feb 13 08:29:34.408176 env[1475]: time="2024-02-13T08:29:34.408129682Z" level=info msg="CreateContainer within sandbox \"b7a66e9a1b86daab9a1e8f848b8d748407ec4a631dbded66215f2da9fa6abe02\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"be21faf5e8939dcbde45f12f937ff4ab89d6977c9ac7f7a7a236b8a2e23c92bb\"" Feb 13 08:29:34.408338 env[1475]: time="2024-02-13T08:29:34.408294929Z" level=info msg="StartContainer for \"be21faf5e8939dcbde45f12f937ff4ab89d6977c9ac7f7a7a236b8a2e23c92bb\"" Feb 13 08:29:34.414594 systemd[1]: Started cri-containerd-3a585599a2352804d55c68396dabfa4619ac713c50830c95d5b78c2dd7d16f8a.scope. Feb 13 08:29:34.416609 systemd[1]: Started cri-containerd-45d0bfbd2c7ea136636809c907327b4cb4539c19867520608208227122b8d74a.scope. Feb 13 08:29:34.417255 systemd[1]: Started cri-containerd-be21faf5e8939dcbde45f12f937ff4ab89d6977c9ac7f7a7a236b8a2e23c92bb.scope. 
Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.447756 kernel: kauditd_printk_skb: 451 callbacks suppressed Feb 13 08:29:34.447786 kernel: audit: type=1400 audit(1707812974.419:658): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.547609 kubelet[2180]: W0213 08:29:34.547547 2180 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.RuntimeClass: Get "https://145.40.67.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:34.547609 kubelet[2180]: E0213 08:29:34.547587 2180 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://145.40.67.89:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:34.573716 kernel: audit: type=1400 audit(1707812974.419:659): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.573750 kernel: audit: type=1400 audit(1707812974.419:660): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.636739 kernel: audit: type=1400 audit(1707812974.419:661): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.730168 kubelet[2180]: E0213 08:29:34.730128 2180 controller.go:146] failed to ensure lease exists, will retry in 1.6s, error: Get "https://145.40.67.89:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.2-a-5e41ede811?timeout=10s": dial tcp 145.40.67.89:6443: connect: connection refused Feb 13 08:29:34.763379 kernel: audit: type=1400 audit(1707812974.419:662): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.763409 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 08:29:34.763423 kernel: audit: type=1400 audit(1707812974.419:663): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.790126 kernel: audit: audit_lost=1 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 
08:29:34.790154 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 08:29:34.790169 kernel: audit: audit_lost=2 audit_rate_limit=0 audit_backlog_limit=64 Feb 13 08:29:34.839662 kubelet[2180]: I0213 08:29:34.839605 2180 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:34.839727 kubelet[2180]: E0213 08:29:34.839720 2180 kubelet_node_status.go:92] "Unable to register node with API server" err="Post \"https://145.40.67.89:6443/api/v1/nodes\": dial tcp 145.40.67.89:6443: connect: connection refused" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.419000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit: BPF prog-id=79 op=LOAD Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c0001c7c48 a2=10 a3=1c items=0 ppid=2283 pid=2392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.571000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353835353939613233353238303464353563363833393664616266 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001c76b0 a2=3c a3=8 items=0 ppid=2283 pid=2392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.571000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353835353939613233353238303464353563363833393664616266 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC 
avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.572000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.571000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit: BPF prog-id=80 op=LOAD Feb 13 08:29:34.571000 audit: BPF prog-id=81 op=LOAD Feb 13 08:29:34.571000 audit[2392]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001c79d8 a2=78 a3=c000118420 items=0 ppid=2283 pid=2392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.571000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353835353939613233353238303464353563363833393664616266 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC 
avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c0001bdc48 a2=10 a3=1c items=0 ppid=2294 pid=2411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435643062666264326337656131333636333638303963393037333237 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001bd6b0 a2=3c a3=8 items=0 ppid=2294 pid=2411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435643062666264326337656131333636333638303963393037333237 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.698000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 08:29:34.698000 audit[2392]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c0001c7770 a2=78 a3=c000118468 items=0 ppid=2283 pid=2392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353835353939613233353238303464353563363833393664616266 Feb 13 08:29:34.852000 audit: BPF prog-id=82 op=UNLOAD Feb 13 08:29:34.698000 audit[2411]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bd9d8 a2=78 a3=c0002efdc0 items=0 ppid=2294 pid=2411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435643062666264326337656131333636333638303963393037333237 Feb 13 08:29:34.852000 audit: BPF prog-id=81 op=UNLOAD Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 
audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { perfmon } for pid=2392 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit[2392]: AVC avc: denied { bpf } for pid=2392 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.852000 audit: BPF prog-id=84 op=LOAD Feb 13 08:29:34.852000 audit: BPF prog-id=85 op=LOAD Feb 13 08:29:34.852000 audit[2411]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c0001bd770 a2=78 a3=c0002efe08 items=0 ppid=2294 pid=2411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.852000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435643062666264326337656131333636333638303963393037333237 Feb 13 08:29:34.852000 audit[2392]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001c7c30 a2=78 a3=c000118878 items=0 ppid=2283 pid=2392 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361353835353939613233353238303464353563363833393664616266 Feb 13 08:29:34.933000 audit: BPF prog-id=84 op=UNLOAD Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit: BPF prog-id=83 op=UNLOAD Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:29:34.933000 audit[2411]: AVC avc: denied { perfmon } for pid=2411 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit: BPF prog-id=86 op=LOAD Feb 13 08:29:34.933000 audit[2411]: AVC avc: denied { bpf } for pid=2411 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit: BPF prog-id=87 op=LOAD Feb 13 08:29:34.933000 audit[2411]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bdc30 a2=78 a3=c0003ce218 items=0 ppid=2294 pid=2411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435643062666264326337656131333636333638303963393037333237 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2272 pid=2412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323166616635653839333964636264653435663132663933376666 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001976b0 a2=3c a3=8 items=0 ppid=2272 pid=2412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323166616635653839333964636264653435663132663933376666 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: 
denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.933000 audit: BPF prog-id=88 op=LOAD Feb 13 08:29:34.933000 audit[2412]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001979d8 a2=78 a3=c000099990 items=0 ppid=2272 pid=2412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.933000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323166616635653839333964636264653435663132663933376666 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC 
avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit: BPF prog-id=89 op=LOAD Feb 13 08:29:34.934000 audit[2412]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000197770 a2=78 a3=c0000999d8 items=0 ppid=2272 pid=2412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323166616635653839333964636264653435663132663933376666 Feb 13 08:29:34.934000 audit: BPF prog-id=89 op=UNLOAD Feb 13 08:29:34.934000 audit: BPF prog-id=88 op=UNLOAD Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { perfmon } for pid=2412 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit[2412]: AVC avc: denied { bpf } for pid=2412 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:34.934000 audit: BPF prog-id=90 op=LOAD Feb 13 08:29:34.934000 audit[2412]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000197c30 a2=78 a3=c000099de8 items=0 ppid=2272 pid=2412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:34.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323166616635653839333964636264653435663132663933376666 Feb 13 08:29:34.951692 env[1475]: time="2024-02-13T08:29:34.951660048Z" level=info msg="StartContainer for 
\"be21faf5e8939dcbde45f12f937ff4ab89d6977c9ac7f7a7a236b8a2e23c92bb\" returns successfully" Feb 13 08:29:34.951961 env[1475]: time="2024-02-13T08:29:34.951714891Z" level=info msg="StartContainer for \"45d0bfbd2c7ea136636809c907327b4cb4539c19867520608208227122b8d74a\" returns successfully" Feb 13 08:29:34.951961 env[1475]: time="2024-02-13T08:29:34.951773196Z" level=info msg="StartContainer for \"3a585599a2352804d55c68396dabfa4619ac713c50830c95d5b78c2dd7d16f8a\" returns successfully" Feb 13 08:29:35.602000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.602000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.602000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=8 a1=c0003a74d0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:35.602000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:35.602000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=9 a1=c00012e360 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:35.602000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:35.884000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.884000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.884000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=44 a1=c000778000 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:29:35.884000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=43 a1=c002dd2060 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:29:35.884000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:29:35.884000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:29:35.885000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.885000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=43 a1=c00b70a000 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:29:35.885000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:29:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=49 a1=c003dda000 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:29:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:29:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=55 a1=c002d94ec0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:29:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:29:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=5a a1=c00b4cf7d0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:29:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:29:36.338174 kubelet[2180]: E0213 08:29:36.338079 2180 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3510.3.2-a-5e41ede811\" not found" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:36.444087 kubelet[2180]: I0213 08:29:36.443987 2180 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:36.933424 kubelet[2180]: I0213 08:29:36.933363 2180 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:36.964907 kubelet[2180]: E0213 08:29:36.964821 2180 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:37.065512 kubelet[2180]: E0213 08:29:37.065444 2180 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:37.165925 kubelet[2180]: E0213 08:29:37.165847 2180 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:37.266751 kubelet[2180]: E0213 08:29:37.266623 2180 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:37.366843 kubelet[2180]: E0213 08:29:37.366787 2180 kubelet_node_status.go:458] "Error getting the current node from lister" err="node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:37.467382 kubelet[2180]: E0213 08:29:37.467289 2180 kubelet_node_status.go:458] "Error getting the 
current node from lister" err="node \"ci-3510.3.2-a-5e41ede811\" not found" Feb 13 08:29:38.328531 kubelet[2180]: I0213 08:29:38.328420 2180 apiserver.go:52] "Watching apiserver" Feb 13 08:29:38.427681 kubelet[2180]: I0213 08:29:38.427575 2180 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 08:29:38.457149 kubelet[2180]: I0213 08:29:38.457035 2180 reconciler.go:41] "Reconciler: start to sync state" Feb 13 08:29:39.262322 systemd[1]: Reloading. Feb 13 08:29:39.345726 /usr/lib/systemd/system-generators/torcx-generator[2554]: time="2024-02-13T08:29:39Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.2 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.2 /var/lib/torcx/store]" Feb 13 08:29:39.345784 /usr/lib/systemd/system-generators/torcx-generator[2554]: time="2024-02-13T08:29:39Z" level=info msg="torcx already run" Feb 13 08:29:39.443031 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Feb 13 08:29:39.443042 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 13 08:29:39.458913 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.535116 kernel: kauditd_printk_skb: 193 callbacks suppressed Feb 13 08:29:39.535174 kernel: audit: type=1400 audit(1707812979.507:719): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.658411 kernel: audit: type=1400 audit(1707812979.507:720): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.658447 kernel: audit: type=1400 audit(1707812979.507:721): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.719859 kernel: audit: type=1400 audit(1707812979.507:722): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.781677 kernel: audit: type=1400 audit(1707812979.507:723): avc: denied { perfmon } for pid=1 
comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.844180 kernel: audit: type=1400 audit(1707812979.507:724): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.969377 kernel: audit: type=1400 audit(1707812979.507:725): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.969406 kernel: audit: type=1400 audit(1707812979.507:726): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031763 kernel: audit: type=1400 audit(1707812979.507:727): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.507000 audit[1]: 
AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.093766 kernel: audit: type=1400 audit(1707812979.656:728): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit: BPF prog-id=91 op=LOAD Feb 13 08:29:39.656000 audit: BPF prog-id=80 op=UNLOAD Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:29:39.656000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.656000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.779000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.779000 audit: BPF prog-id=92 op=LOAD Feb 13 08:29:39.779000 audit: BPF prog-id=71 op=UNLOAD Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 
audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.780000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.904000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.904000 audit: BPF prog-id=93 op=LOAD Feb 13 08:29:39.904000 audit: BPF prog-id=52 op=UNLOAD Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: 
denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:39.906000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.029000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.029000 audit: BPF prog-id=94 op=LOAD Feb 13 08:29:40.029000 audit: BPF prog-id=53 op=UNLOAD Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { perfmon } 
for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.031000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.082000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/opt/libexec/kubernetes/kubelet-plugins/volume/exec" dev="sdb9" ino=525104 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:usr_t:s0 tclass=dir permissive=0 Feb 13 08:29:40.082000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=a a1=c000934e80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:40.082000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit: BPF prog-id=95 op=LOAD Feb 13 08:29:40.154000 audit: BPF prog-id=54 op=UNLOAD Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit: BPF prog-id=96 op=LOAD Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit: BPF prog-id=97 op=LOAD Feb 13 08:29:40.154000 audit: BPF prog-id=55 op=UNLOAD Feb 13 08:29:40.154000 audit: BPF prog-id=56 op=UNLOAD Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.154000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit: BPF prog-id=98 op=LOAD Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit: BPF prog-id=99 op=LOAD Feb 13 08:29:40.155000 audit: BPF prog-id=57 op=UNLOAD Feb 13 08:29:40.155000 audit: BPF prog-id=58 op=UNLOAD Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.155000 audit: BPF prog-id=100 op=LOAD Feb 13 08:29:40.155000 audit: BPF prog-id=79 op=UNLOAD Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit: BPF prog-id=101 op=LOAD Feb 13 08:29:40.156000 audit: BPF prog-id=59 op=UNLOAD Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit: BPF prog-id=102 op=LOAD Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit: BPF prog-id=103 op=LOAD Feb 13 08:29:40.156000 audit: BPF prog-id=60 op=UNLOAD Feb 13 08:29:40.156000 audit: BPF prog-id=61 op=UNLOAD Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.156000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit: BPF prog-id=104 op=LOAD Feb 13 08:29:40.157000 audit: BPF prog-id=62 op=UNLOAD Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit: BPF prog-id=105 op=LOAD Feb 13 08:29:40.157000 audit: BPF prog-id=75 op=UNLOAD Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.157000 audit: BPF prog-id=106 op=LOAD Feb 13 08:29:40.157000 audit: BPF prog-id=63 op=UNLOAD Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit: BPF prog-id=107 op=LOAD Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit: BPF prog-id=108 op=LOAD Feb 13 08:29:40.158000 audit: BPF prog-id=64 op=UNLOAD Feb 13 08:29:40.158000 audit: BPF prog-id=65 op=UNLOAD Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit: BPF prog-id=109 op=LOAD Feb 13 08:29:40.158000 audit: BPF prog-id=67 op=UNLOAD Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.158000 audit: BPF prog-id=110 op=LOAD 
Feb 13 08:29:40.158000 audit: BPF prog-id=66 op=UNLOAD Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit[1]: AVC avc: denied { bpf } for pid=1 
comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.159000 audit: BPF prog-id=111 op=LOAD Feb 13 08:29:40.159000 audit: BPF prog-id=86 op=UNLOAD Feb 13 08:29:40.167284 systemd[1]: Stopping kubelet.service... Feb 13 08:29:40.167403 kubelet[2180]: I0213 08:29:40.167287 2180 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 08:29:40.194256 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 08:29:40.194360 systemd[1]: Stopped kubelet.service. Feb 13 08:29:40.193000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:40.195263 systemd[1]: Started kubelet.service. Feb 13 08:29:40.194000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:40.217687 kubelet[2614]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. Image garbage collector will get sandbox image information from CRI. Feb 13 08:29:40.217687 kubelet[2614]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:29:40.217896 kubelet[2614]: I0213 08:29:40.217715 2614 server.go:198] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 08:29:40.218526 kubelet[2614]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.27. 
Image garbage collector will get sandbox image information from CRI. Feb 13 08:29:40.218526 kubelet[2614]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 08:29:40.220824 kubelet[2614]: I0213 08:29:40.220767 2614 server.go:412] "Kubelet version" kubeletVersion="v1.26.5" Feb 13 08:29:40.220824 kubelet[2614]: I0213 08:29:40.220783 2614 server.go:414] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 08:29:40.221131 kubelet[2614]: I0213 08:29:40.221113 2614 server.go:836] "Client rotation is on, will bootstrap in background" Feb 13 08:29:40.221839 kubelet[2614]: I0213 08:29:40.221832 2614 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 08:29:40.222242 kubelet[2614]: I0213 08:29:40.222204 2614 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 08:29:40.240227 kubelet[2614]: I0213 08:29:40.240182 2614 server.go:659] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 08:29:40.240316 kubelet[2614]: I0213 08:29:40.240306 2614 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 08:29:40.240376 kubelet[2614]: I0213 08:29:40.240353 2614 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={RuntimeCgroupsName: SystemCgroupsName: KubeletCgroupsName: KubeletOOMScoreAdj:-999 ContainerRuntime: CgroupsPerQOS:true CgroupRoot:/ CgroupDriver:systemd KubeletRootDir:/var/lib/kubelet ProtectKernelDefaults:false NodeAllocatableConfig:{KubeReservedCgroupName: SystemReservedCgroupName: ReservedSystemCPUs: EnforceNodeAllocatable:map[pods:{}] KubeReserved:map[] SystemReserved:map[] HardEvictionThresholds:[{Signal:nodefs.available Operator:LessThan Value:{Quantity: Percentage:0.1} GracePeriod:0s MinReclaim:} {Signal:nodefs.inodesFree Operator:LessThan Value:{Quantity: Percentage:0.05} GracePeriod:0s MinReclaim:} {Signal:imagefs.available Operator:LessThan Value:{Quantity: Percentage:0.15} GracePeriod:0s MinReclaim:} {Signal:memory.available Operator:LessThan Value:{Quantity:100Mi Percentage:0} GracePeriod:0s MinReclaim:}]} QOSReserved:map[] CPUManagerPolicy:none CPUManagerPolicyOptions:map[] ExperimentalTopologyManagerScope:container CPUManagerReconcilePeriod:10s ExperimentalMemoryManagerPolicy:None ExperimentalMemoryManagerReservedMemory:[] ExperimentalPodPidsLimit:-1 EnforceCPULimits:true CPUCFSQuotaPeriod:100ms ExperimentalTopologyManagerPolicy:none ExperimentalTopologyManagerPolicyOptions:map[]} Feb 13 08:29:40.240376 kubelet[2614]: I0213 08:29:40.240366 2614 topology_manager.go:134] "Creating topology manager with policy per scope" topologyPolicyName="none" topologyScopeName="container" Feb 13 08:29:40.240376 kubelet[2614]: I0213 08:29:40.240374 2614 container_manager_linux.go:308] "Creating device plugin manager" Feb 13 08:29:40.240472 kubelet[2614]: I0213 08:29:40.240395 2614 state_mem.go:36] "Initialized new 
in-memory state store" Feb 13 08:29:40.242093 kubelet[2614]: I0213 08:29:40.242059 2614 kubelet.go:398] "Attempting to sync node with API server" Feb 13 08:29:40.242093 kubelet[2614]: I0213 08:29:40.242074 2614 kubelet.go:286] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 08:29:40.242093 kubelet[2614]: I0213 08:29:40.242088 2614 kubelet.go:297] "Adding apiserver pod source" Feb 13 08:29:40.242176 kubelet[2614]: I0213 08:29:40.242097 2614 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 08:29:40.242402 kubelet[2614]: I0213 08:29:40.242355 2614 kuberuntime_manager.go:244] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Feb 13 08:29:40.242650 kubelet[2614]: I0213 08:29:40.242643 2614 server.go:1186] "Started kubelet" Feb 13 08:29:40.242776 kubelet[2614]: I0213 08:29:40.242762 2614 server.go:161] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 08:29:40.242950 kubelet[2614]: E0213 08:29:40.242934 2614 cri_stats_provider.go:455] "Failed to get the info of the filesystem with mountpoint" err="unable to find data in memory cache" mountpoint="/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs" Feb 13 08:29:40.242988 kubelet[2614]: E0213 08:29:40.242959 2614 kubelet.go:1386] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 08:29:40.242000 audit[2614]: AVC avc: denied { mac_admin } for pid=2614 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.242000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:29:40.242000 audit[2614]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0006155c0 a1=c00005d428 a2=c000615590 a3=25 items=0 ppid=1 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:40.242000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:29:40.242000 audit[2614]: AVC avc: denied { mac_admin } for pid=2614 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:40.242000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:29:40.242000 audit[2614]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c001630000 a1=c001632000 a2=c001624060 a3=25 items=0 ppid=1 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:40.242000 audit: PROCTITLE 
proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:29:40.244432 kubelet[2614]: I0213 08:29:40.244128 2614 kubelet.go:1341] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Feb 13 08:29:40.244432 kubelet[2614]: I0213 08:29:40.244340 2614 kubelet.go:1345] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Feb 13 08:29:40.244432 kubelet[2614]: I0213 08:29:40.244364 2614 server.go:451] "Adding debug handlers to kubelet server" Feb 13 08:29:40.244432 kubelet[2614]: I0213 08:29:40.244366 2614 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 08:29:40.244567 kubelet[2614]: I0213 08:29:40.244457 2614 volume_manager.go:293] "Starting Kubelet Volume Manager" Feb 13 08:29:40.244567 kubelet[2614]: I0213 08:29:40.244487 2614 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Feb 13 08:29:40.257166 kubelet[2614]: I0213 08:29:40.257147 2614 kubelet_network_linux.go:63] "Initialized iptables rules." protocol=IPv4 Feb 13 08:29:40.263973 kubelet[2614]: I0213 08:29:40.263957 2614 kubelet_network_linux.go:63] "Initialized iptables rules." 
protocol=IPv6 Feb 13 08:29:40.263973 kubelet[2614]: I0213 08:29:40.263971 2614 status_manager.go:176] "Starting to sync pod status with apiserver" Feb 13 08:29:40.264063 kubelet[2614]: I0213 08:29:40.264000 2614 kubelet.go:2113] "Starting kubelet main sync loop" Feb 13 08:29:40.264063 kubelet[2614]: E0213 08:29:40.264035 2614 kubelet.go:2137] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 08:29:40.266586 kubelet[2614]: I0213 08:29:40.266574 2614 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 08:29:40.266586 kubelet[2614]: I0213 08:29:40.266583 2614 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 08:29:40.266668 kubelet[2614]: I0213 08:29:40.266592 2614 state_mem.go:36] "Initialized new in-memory state store" Feb 13 08:29:40.266692 kubelet[2614]: I0213 08:29:40.266680 2614 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 08:29:40.266692 kubelet[2614]: I0213 08:29:40.266688 2614 state_mem.go:96] "Updated CPUSet assignments" assignments=map[] Feb 13 08:29:40.266692 kubelet[2614]: I0213 08:29:40.266691 2614 policy_none.go:49] "None policy: Start" Feb 13 08:29:40.266905 kubelet[2614]: I0213 08:29:40.266899 2614 memory_manager.go:169] "Starting memorymanager" policy="None" Feb 13 08:29:40.266925 kubelet[2614]: I0213 08:29:40.266909 2614 state_mem.go:35] "Initializing new in-memory state store" Feb 13 08:29:40.266993 kubelet[2614]: I0213 08:29:40.266988 2614 state_mem.go:75] "Updated machine memory state" Feb 13 08:29:40.268874 kubelet[2614]: I0213 08:29:40.268868 2614 manager.go:455] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 08:29:40.266000 audit[2614]: AVC avc: denied { mac_admin } for pid=2614 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:29:40.266000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Feb 13 08:29:40.266000 audit[2614]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c0018b8810 a1=c000d9b368 a2=c0018b87e0 a3=25 items=0 ppid=1 pid=2614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/opt/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:40.266000 audit: PROCTITLE proctitle=2F6F70742F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Feb 13 08:29:40.269043 kubelet[2614]: I0213 08:29:40.268900 2614 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Feb 13 08:29:40.269066 kubelet[2614]: I0213 08:29:40.269051 2614 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 08:29:40.346470 kubelet[2614]: I0213 08:29:40.346412 2614 kubelet_node_status.go:70] "Attempting to register node" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.351033 kubelet[2614]: I0213 08:29:40.350995 2614 kubelet_node_status.go:108] "Node was previously registered" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.351033 kubelet[2614]: I0213 08:29:40.351033 2614 kubelet_node_status.go:73] "Successfully registered node" node="ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.364134 kubelet[2614]: I0213 08:29:40.364094 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:40.364134 kubelet[2614]: I0213 08:29:40.364134 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:40.364216 kubelet[2614]: I0213 08:29:40.364152 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:40.367625 
kubelet[2614]: E0213 08:29:40.367579 2614 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-3510.3.2-a-5e41ede811\" already exists" pod="kube-system/kube-scheduler-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.458859 kubelet[2614]: E0213 08:29:40.458816 2614 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546186 kubelet[2614]: I0213 08:29:40.546094 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/15bd00ad85baf2935512d2bdf5998a02-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" (UID: \"15bd00ad85baf2935512d2bdf5998a02\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546186 kubelet[2614]: I0213 08:29:40.546117 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-ca-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546186 kubelet[2614]: I0213 08:29:40.546132 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546186 kubelet[2614]: I0213 08:29:40.546145 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546354 kubelet[2614]: I0213 08:29:40.546199 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/85d9a141d1ad4632795a3a88f1cc049e-kubeconfig\") pod \"kube-scheduler-ci-3510.3.2-a-5e41ede811\" (UID: \"85d9a141d1ad4632795a3a88f1cc049e\") " pod="kube-system/kube-scheduler-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546354 kubelet[2614]: I0213 08:29:40.546218 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/15bd00ad85baf2935512d2bdf5998a02-ca-certs\") pod \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" (UID: \"15bd00ad85baf2935512d2bdf5998a02\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546354 kubelet[2614]: I0213 08:29:40.546230 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/15bd00ad85baf2935512d2bdf5998a02-k8s-certs\") pod \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" (UID: \"15bd00ad85baf2935512d2bdf5998a02\") " pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546354 kubelet[2614]: I0213 08:29:40.546245 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.546354 kubelet[2614]: I0213 08:29:40.546257 2614 
reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/da9509f303d934bf8e6e4f732dc10d6f-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" (UID: \"da9509f303d934bf8e6e4f732dc10d6f\") " pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:40.939000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:40.939000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000728180 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:40.939000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:40.941000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:40.941000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0007281c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:40.941000 audit: 
PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:40.944000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:40.944000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000570240 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:40.944000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:40.947000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:29:40.947000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000bfcd80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:29:40.947000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:29:41.243481 kubelet[2614]: I0213 08:29:41.243257 2614 apiserver.go:52] "Watching apiserver" Feb 13 08:29:41.345760 kubelet[2614]: I0213 08:29:41.345659 2614 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 08:29:41.353051 kubelet[2614]: I0213 08:29:41.352997 2614 reconciler.go:41] "Reconciler: start to sync state" Feb 13 08:29:41.646312 kubelet[2614]: E0213 08:29:41.646262 2614 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-3510.3.2-a-5e41ede811\" already exists" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:41.846427 kubelet[2614]: E0213 08:29:41.846376 2614 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-3510.3.2-a-5e41ede811\" already exists" pod="kube-system/kube-scheduler-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:42.052190 kubelet[2614]: E0213 08:29:42.052096 2614 kubelet.go:1802] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.2-a-5e41ede811\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" Feb 13 08:29:42.251585 kubelet[2614]: I0213 08:29:42.251563 2614 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510.3.2-a-5e41ede811" podStartSLOduration=2.251527721 pod.CreationTimestamp="2024-02-13 08:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:29:42.251274203 +0000 UTC m=+2.054166034" watchObservedRunningTime="2024-02-13 08:29:42.251527721 +0000 UTC m=+2.054419551" Feb 13 08:29:43.052357 kubelet[2614]: I0213 
08:29:43.052302 2614 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510.3.2-a-5e41ede811" podStartSLOduration=5.052260005 pod.CreationTimestamp="2024-02-13 08:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:29:43.051998308 +0000 UTC m=+2.854890138" watchObservedRunningTime="2024-02-13 08:29:43.052260005 +0000 UTC m=+2.855151836" Feb 13 08:29:43.052473 kubelet[2614]: I0213 08:29:43.052423 2614 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510.3.2-a-5e41ede811" podStartSLOduration=5.052414071 pod.CreationTimestamp="2024-02-13 08:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:29:42.653881929 +0000 UTC m=+2.456773835" watchObservedRunningTime="2024-02-13 08:29:43.052414071 +0000 UTC m=+2.855305898" Feb 13 08:29:45.388263 sudo[1630]: pam_unix(sudo:session): session closed for user root Feb 13 08:29:45.386000 audit[1630]: USER_END pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:29:45.389094 sshd[1627]: pam_unix(sshd:session): session closed for user core Feb 13 08:29:45.414295 kernel: kauditd_printk_skb: 263 callbacks suppressed Feb 13 08:29:45.414327 kernel: audit: type=1106 audit(1707812985.386:973): pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:29:45.390472 systemd[1]: sshd@6-145.40.67.89:22-139.178.68.195:42274.service: Deactivated successfully. 
Feb 13 08:29:45.390944 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 08:29:45.391047 systemd[1]: session-9.scope: Consumed 2.741s CPU time. Feb 13 08:29:45.391437 systemd-logind[1463]: Session 9 logged out. Waiting for processes to exit. Feb 13 08:29:45.391944 systemd-logind[1463]: Removed session 9. Feb 13 08:29:45.386000 audit[1630]: CRED_DISP pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:29:45.585492 kernel: audit: type=1104 audit(1707812985.386:974): pid=1630 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Feb 13 08:29:45.585521 kernel: audit: type=1106 audit(1707812985.387:975): pid=1627 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:45.387000 audit[1627]: USER_END pid=1627 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:45.387000 audit[1627]: CRED_DISP pid=1627 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:45.766249 kernel: audit: type=1104 audit(1707812985.387:976): pid=1627 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:29:45.766279 kernel: audit: type=1131 audit(1707812985.388:977): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.67.89:22-139.178.68.195:42274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:45.388000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-145.40.67.89:22-139.178.68.195:42274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:46.685096 systemd[1]: Started sshd@7-145.40.67.89:22-161.35.108.241:46094.service. Feb 13 08:29:46.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.89:22-161.35.108.241:46094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:46.775145 kernel: audit: type=1130 audit(1707812986.683:978): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.89:22-161.35.108.241:46094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:47.105024 sshd[2797]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:29:47.103000 audit[2797]: USER_AUTH pid=2797 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:29:47.195137 kernel: audit: type=1100 audit(1707812987.103:979): pid=2797 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:29:49.136645 sshd[2797]: Failed password for root from 161.35.108.241 port 46094 ssh2 Feb 13 08:29:49.518054 update_engine[1465]: I0213 08:29:49.517918 1465 update_attempter.cc:509] Updating boot flags... Feb 13 08:29:50.027539 sshd[2797]: Received disconnect from 161.35.108.241 port 46094:11: Bye Bye [preauth] Feb 13 08:29:50.027539 sshd[2797]: Disconnected from authenticating user root 161.35.108.241 port 46094 [preauth] Feb 13 08:29:50.030044 systemd[1]: sshd@7-145.40.67.89:22-161.35.108.241:46094.service: Deactivated successfully. Feb 13 08:29:50.029000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.89:22-161.35.108.241:46094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:50.120935 kernel: audit: type=1131 audit(1707812990.029:980): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-145.40.67.89:22-161.35.108.241:46094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:51.689993 kubelet[2614]: I0213 08:29:51.689945 2614 kuberuntime_manager.go:1114] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 08:29:51.690228 env[1475]: time="2024-02-13T08:29:51.690130282Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 08:29:51.690337 kubelet[2614]: I0213 08:29:51.690257 2614 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 08:29:52.473663 kubelet[2614]: I0213 08:29:52.473646 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:52.477518 systemd[1]: Created slice kubepods-besteffort-pod2bfff88a_73ed_4c8e_b58d_4d4daaa82672.slice. 
Feb 13 08:29:52.636255 kubelet[2614]: I0213 08:29:52.636184 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2bfff88a-73ed-4c8e-b58d-4d4daaa82672-kube-proxy\") pod \"kube-proxy-t4wv2\" (UID: \"2bfff88a-73ed-4c8e-b58d-4d4daaa82672\") " pod="kube-system/kube-proxy-t4wv2" Feb 13 08:29:52.636542 kubelet[2614]: I0213 08:29:52.636334 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2bfff88a-73ed-4c8e-b58d-4d4daaa82672-xtables-lock\") pod \"kube-proxy-t4wv2\" (UID: \"2bfff88a-73ed-4c8e-b58d-4d4daaa82672\") " pod="kube-system/kube-proxy-t4wv2" Feb 13 08:29:52.636773 kubelet[2614]: I0213 08:29:52.636575 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2bfff88a-73ed-4c8e-b58d-4d4daaa82672-lib-modules\") pod \"kube-proxy-t4wv2\" (UID: \"2bfff88a-73ed-4c8e-b58d-4d4daaa82672\") " pod="kube-system/kube-proxy-t4wv2" Feb 13 08:29:52.636773 kubelet[2614]: I0213 08:29:52.636712 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fx7t\" (UniqueName: \"kubernetes.io/projected/2bfff88a-73ed-4c8e-b58d-4d4daaa82672-kube-api-access-8fx7t\") pod \"kube-proxy-t4wv2\" (UID: \"2bfff88a-73ed-4c8e-b58d-4d4daaa82672\") " pod="kube-system/kube-proxy-t4wv2" Feb 13 08:29:52.704329 kubelet[2614]: I0213 08:29:52.704263 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:52.714901 systemd[1]: Created slice kubepods-besteffort-pod3fc1226e_2256_4168_8a4f_4e28a797bb67.slice. 
Feb 13 08:29:52.796306 env[1475]: time="2024-02-13T08:29:52.796064968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t4wv2,Uid:2bfff88a-73ed-4c8e-b58d-4d4daaa82672,Namespace:kube-system,Attempt:0,}" Feb 13 08:29:52.821261 env[1475]: time="2024-02-13T08:29:52.821028374Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:29:52.821261 env[1475]: time="2024-02-13T08:29:52.821124974Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:29:52.821261 env[1475]: time="2024-02-13T08:29:52.821164420Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:29:52.821817 env[1475]: time="2024-02-13T08:29:52.821588060Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/de48d762b6166e99dc63a1f2a99fbfcfbe7cbf705ef35150f9ffd60c8bb8a758 pid=2826 runtime=io.containerd.runc.v2 Feb 13 08:29:52.838529 kubelet[2614]: I0213 08:29:52.838460 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3fc1226e-2256-4168-8a4f-4e28a797bb67-var-lib-calico\") pod \"tigera-operator-cfc98749c-rh6gg\" (UID: \"3fc1226e-2256-4168-8a4f-4e28a797bb67\") " pod="tigera-operator/tigera-operator-cfc98749c-rh6gg" Feb 13 08:29:52.838880 kubelet[2614]: I0213 08:29:52.838776 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvk5b\" (UniqueName: \"kubernetes.io/projected/3fc1226e-2256-4168-8a4f-4e28a797bb67-kube-api-access-nvk5b\") pod \"tigera-operator-cfc98749c-rh6gg\" (UID: \"3fc1226e-2256-4168-8a4f-4e28a797bb67\") " pod="tigera-operator/tigera-operator-cfc98749c-rh6gg" Feb 13 08:29:52.854342 
systemd[1]: Started cri-containerd-de48d762b6166e99dc63a1f2a99fbfcfbe7cbf705ef35150f9ffd60c8bb8a758.scope. Feb 13 08:29:52.868000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.868000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.998004 kernel: audit: type=1400 audit(1707812992.868:981): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.998056 kernel: audit: type=1400 audit(1707812992.868:982): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.998074 kernel: audit: type=1400 audit(1707812992.868:983): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.868000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.019387 env[1475]: time="2024-02-13T08:29:53.019327508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-rh6gg,Uid:3fc1226e-2256-4168-8a4f-4e28a797bb67,Namespace:tigera-operator,Attempt:0,}" Feb 13 08:29:53.026039 env[1475]: time="2024-02-13T08:29:53.025965173Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:29:53.026039 env[1475]: time="2024-02-13T08:29:53.025985102Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:29:53.026039 env[1475]: time="2024-02-13T08:29:53.025991988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:29:53.026130 env[1475]: time="2024-02-13T08:29:53.026047031Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/530ec53a771d588d54d123622765f4a6dcd9a6cc36d085e6aff90de2f03b65c2 pid=2861 runtime=io.containerd.runc.v2 Feb 13 08:29:53.031667 systemd[1]: Started cri-containerd-530ec53a771d588d54d123622765f4a6dcd9a6cc36d085e6aff90de2f03b65c2.scope. Feb 13 08:29:52.868000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.126752 kernel: audit: type=1400 audit(1707812992.868:984): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.126789 kernel: audit: type=1400 audit(1707812992.869:985): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.869000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.191232 kernel: audit: type=1400 audit(1707812992.869:986): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.869000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.255715 kernel: audit: type=1400 audit(1707812992.869:987): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.869000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.320159 kernel: audit: type=1400 audit(1707812992.869:988): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.869000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.384640 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 08:29:53.384673 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Feb 13 08:29:52.869000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.996000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.996000 audit: BPF prog-id=112 op=LOAD Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=2826 pid=2836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:52.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465343864373632623631363665393964633633613166326139396662 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=2826 pid=2836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:52.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465343864373632623631363665393964633633613166326139396662 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: 
AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.132000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:52.997000 audit: BPF prog-id=113 op=LOAD Feb 13 08:29:52.997000 audit[2836]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c00020a560 items=0 ppid=2826 pid=2836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:52.997000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465343864373632623631363665393964633633613166326139396662 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: 
SYSCALL arch=c000003e syscall=321 success=no exit=-11 a0=5 a1=c000197770 a2=78 a3=c00020a858 items=0 ppid=2826 pid=2836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.253000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465343864373632623631363665393964633633613166326139396662 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit: BPF prog-id=114 op=LOAD Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000147c48 a2=10 a3=1c items=0 ppid=2861 pid=2872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533306563353361373731643538386435346431323336323237363566 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001476b0 a2=3c a3=c items=0 ppid=2861 pid=2872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 
13 08:29:53.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533306563353361373731643538386435346431323336323237363566 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.253000 audit[2836]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=1 items=0 ppid=2826 pid=2836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.253000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465343864373632623631363665393964633633613166326139396662 Feb 13 08:29:53.383000 audit: BPF prog-id=115 op=UNLOAD Feb 13 08:29:53.383000 audit: BPF prog-id=113 op=UNLOAD Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { perfmon } for pid=2836 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: AVC avc: denied { bpf } for pid=2836 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit: BPF prog-id=116 op=LOAD Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit[2836]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c00020ac68 items=0 ppid=2826 pid=2836 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.383000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465343864373632623631363665393964633633613166326139396662 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.319000 audit: BPF prog-id=117 op=LOAD Feb 13 08:29:53.319000 audit[2872]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001479d8 a2=78 
a3=c0003e84c0 items=0 ppid=2861 pid=2872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533306563353361373731643538386435346431323336323237363566 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit: BPF prog-id=118 op=LOAD Feb 13 08:29:53.438000 audit[2872]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000147770 a2=78 a3=c0003e8508 items=0 ppid=2861 pid=2872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.438000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533306563353361373731643538386435346431323336323237363566 Feb 13 08:29:53.438000 audit: BPF prog-id=118 op=UNLOAD Feb 13 08:29:53.438000 audit: BPF prog-id=117 op=UNLOAD Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { perfmon } for pid=2872 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit[2872]: AVC avc: denied { bpf } for pid=2872 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.438000 audit: BPF prog-id=119 op=LOAD Feb 13 08:29:53.438000 audit[2872]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000147c30 a2=78 a3=c0003e8918 items=0 ppid=2861 pid=2872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.438000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533306563353361373731643538386435346431323336323237363566 Feb 13 08:29:53.443204 env[1475]: time="2024-02-13T08:29:53.443179103Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t4wv2,Uid:2bfff88a-73ed-4c8e-b58d-4d4daaa82672,Namespace:kube-system,Attempt:0,} returns sandbox id \"de48d762b6166e99dc63a1f2a99fbfcfbe7cbf705ef35150f9ffd60c8bb8a758\"" Feb 13 08:29:53.444349 env[1475]: time="2024-02-13T08:29:53.444335745Z" level=info msg="CreateContainer within sandbox \"de48d762b6166e99dc63a1f2a99fbfcfbe7cbf705ef35150f9ffd60c8bb8a758\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 08:29:53.449589 env[1475]: time="2024-02-13T08:29:53.449545480Z" level=info msg="CreateContainer within sandbox \"de48d762b6166e99dc63a1f2a99fbfcfbe7cbf705ef35150f9ffd60c8bb8a758\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7c3785fd9572ed86e55c301ee369e4e0aa7b2efb8e47facbe2a612e2a5c96cde\"" Feb 13 08:29:53.449758 env[1475]: time="2024-02-13T08:29:53.449745810Z" level=info msg="StartContainer for \"7c3785fd9572ed86e55c301ee369e4e0aa7b2efb8e47facbe2a612e2a5c96cde\"" Feb 13 08:29:53.455436 env[1475]: time="2024-02-13T08:29:53.455406354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-cfc98749c-rh6gg,Uid:3fc1226e-2256-4168-8a4f-4e28a797bb67,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"530ec53a771d588d54d123622765f4a6dcd9a6cc36d085e6aff90de2f03b65c2\"" Feb 13 08:29:53.456390 env[1475]: time="2024-02-13T08:29:53.456374598Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\"" Feb 13 08:29:53.457438 systemd[1]: Started cri-containerd-7c3785fd9572ed86e55c301ee369e4e0aa7b2efb8e47facbe2a612e2a5c96cde.scope. 
Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001456b0 a2=3c a3=7f8e007b2cc8 items=0 ppid=2826 pid=2903 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763333738356664393537326564383665353563333031656533363965 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for 
pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit: BPF prog-id=120 op=LOAD Feb 13 08:29:53.464000 audit[2903]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001459d8 a2=78 a3=c00027bc58 items=0 ppid=2826 pid=2903 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763333738356664393537326564383665353563333031656533363965 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" 
capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit: BPF prog-id=121 op=LOAD Feb 13 08:29:53.464000 audit[2903]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000145770 a2=78 a3=c00027bca8 items=0 ppid=2826 pid=2903 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763333738356664393537326564383665353563333031656533363965 Feb 13 08:29:53.464000 audit: BPF prog-id=121 op=UNLOAD Feb 13 08:29:53.464000 audit: BPF prog-id=120 op=UNLOAD Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { 
perfmon } for pid=2903 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit[2903]: AVC avc: denied { bpf } for pid=2903 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:53.464000 audit: BPF prog-id=122 op=LOAD Feb 13 08:29:53.464000 audit[2903]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000145c30 a2=78 a3=c00027bd38 items=0 ppid=2826 pid=2903 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.464000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763333738356664393537326564383665353563333031656533363965 Feb 13 08:29:53.470820 env[1475]: time="2024-02-13T08:29:53.470769445Z" level=info msg="StartContainer for \"7c3785fd9572ed86e55c301ee369e4e0aa7b2efb8e47facbe2a612e2a5c96cde\" returns successfully" Feb 13 08:29:53.491000 audit[2965]: NETFILTER_CFG table=mangle:59 family=2 entries=1 op=nft_register_chain pid=2965 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.491000 audit[2965]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc8d522380 a2=0 a3=7ffc8d52236c items=0 ppid=2919 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Feb 13 08:29:53.491000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 08:29:53.491000 audit[2966]: NETFILTER_CFG table=mangle:60 family=10 entries=1 op=nft_register_chain pid=2966 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.491000 audit[2966]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc0920c830 a2=0 a3=7ffc0920c81c items=0 ppid=2919 pid=2966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.491000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Feb 13 08:29:53.492000 audit[2971]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=2971 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.492000 audit[2971]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd7f042090 a2=0 a3=7ffd7f04207c items=0 ppid=2919 pid=2971 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.492000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 08:29:53.492000 audit[2973]: NETFILTER_CFG table=nat:62 family=10 entries=1 op=nft_register_chain pid=2973 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.492000 audit[2973]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd66c3df50 a2=0 a3=7ffd66c3df3c items=0 ppid=2919 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.492000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Feb 13 08:29:53.492000 audit[2974]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=2974 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.492000 audit[2974]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff73234390 a2=0 a3=7fff7323437c items=0 ppid=2919 pid=2974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.492000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 08:29:53.492000 audit[2975]: NETFILTER_CFG table=filter:64 family=10 entries=1 op=nft_register_chain pid=2975 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.492000 audit[2975]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc1eaac200 a2=0 a3=7ffc1eaac1ec items=0 ppid=2919 pid=2975 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.492000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Feb 13 08:29:53.598000 audit[2976]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=2976 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.598000 audit[2976]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd3819b5e0 a2=0 a3=7ffd3819b5cc items=0 ppid=2919 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.598000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 08:29:53.605000 audit[2978]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=2978 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.605000 audit[2978]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff622bc670 a2=0 a3=7fff622bc65c items=0 ppid=2919 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.605000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Feb 13 08:29:53.615000 audit[2981]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=2981 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.615000 audit[2981]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe35ebcc20 a2=0 a3=7ffe35ebcc0c items=0 ppid=2919 pid=2981 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.615000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Feb 13 08:29:53.618000 audit[2982]: NETFILTER_CFG table=filter:68 family=2 
entries=1 op=nft_register_chain pid=2982 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.618000 audit[2982]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff35f032c0 a2=0 a3=7fff35f032ac items=0 ppid=2919 pid=2982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.618000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 08:29:53.625000 audit[2984]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=2984 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.625000 audit[2984]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc342c83f0 a2=0 a3=7ffc342c83dc items=0 ppid=2919 pid=2984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.625000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 08:29:53.627000 audit[2985]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=2985 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.627000 audit[2985]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd37b0f170 a2=0 a3=7ffd37b0f15c items=0 ppid=2919 pid=2985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.627000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 08:29:53.636000 audit[2987]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=2987 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.636000 audit[2987]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc2d39b400 a2=0 a3=7ffc2d39b3ec items=0 ppid=2919 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.636000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 08:29:53.644000 audit[2990]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=2990 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.644000 audit[2990]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe55e571b0 a2=0 a3=7ffe55e5719c items=0 ppid=2919 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.644000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Feb 13 08:29:53.647000 audit[2991]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_chain pid=2991 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.647000 audit[2991]: SYSCALL arch=c000003e syscall=46 
success=yes exit=100 a0=3 a1=7ffc660517d0 a2=0 a3=7ffc660517bc items=0 ppid=2919 pid=2991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.647000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 08:29:53.655000 audit[2993]: NETFILTER_CFG table=filter:74 family=2 entries=1 op=nft_register_rule pid=2993 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.655000 audit[2993]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe0891dac0 a2=0 a3=7ffe0891daac items=0 ppid=2919 pid=2993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.655000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 08:29:53.657000 audit[2994]: NETFILTER_CFG table=filter:75 family=2 entries=1 op=nft_register_chain pid=2994 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.657000 audit[2994]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd2d3b4a10 a2=0 a3=7ffd2d3b49fc items=0 ppid=2919 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.657000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 08:29:53.663000 audit[2996]: NETFILTER_CFG table=filter:76 family=2 entries=1 
op=nft_register_rule pid=2996 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.663000 audit[2996]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff9c13cc20 a2=0 a3=7fff9c13cc0c items=0 ppid=2919 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.663000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 08:29:53.673000 audit[2999]: NETFILTER_CFG table=filter:77 family=2 entries=1 op=nft_register_rule pid=2999 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.673000 audit[2999]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff3883c650 a2=0 a3=7fff3883c63c items=0 ppid=2919 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.673000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 08:29:53.683000 audit[3002]: NETFILTER_CFG table=filter:78 family=2 entries=1 op=nft_register_rule pid=3002 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.683000 audit[3002]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffde8dcc960 a2=0 a3=7ffde8dcc94c items=0 ppid=2919 pid=3002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.683000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 08:29:53.686000 audit[3003]: NETFILTER_CFG table=nat:79 family=2 entries=1 op=nft_register_chain pid=3003 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.686000 audit[3003]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff6b7a11e0 a2=0 a3=7fff6b7a11cc items=0 ppid=2919 pid=3003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.686000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 08:29:53.692000 audit[3005]: NETFILTER_CFG table=nat:80 family=2 entries=1 op=nft_register_rule pid=3005 subj=system_u:system_r:kernel_t:s0 comm="iptables" Feb 13 08:29:53.692000 audit[3005]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffe48e02e30 a2=0 a3=7ffe48e02e1c items=0 ppid=2919 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.692000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:29:53.701000 audit[3008]: NETFILTER_CFG table=nat:81 family=2 entries=1 op=nft_register_rule pid=3008 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Feb 13 08:29:53.701000 audit[3008]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff0eb80ba0 a2=0 a3=7fff0eb80b8c items=0 ppid=2919 pid=3008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.701000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:29:53.727000 audit[3012]: NETFILTER_CFG table=filter:82 family=2 entries=6 op=nft_register_rule pid=3012 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:53.727000 audit[3012]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7fff34ee9940 a2=0 a3=7fff34ee992c items=0 ppid=2919 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.727000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:53.741000 audit[3012]: NETFILTER_CFG table=nat:83 family=2 entries=17 op=nft_register_chain pid=3012 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:53.741000 audit[3012]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7fff34ee9940 a2=0 a3=7fff34ee992c items=0 ppid=2919 pid=3012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.741000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:53.825000 audit[3042]: NETFILTER_CFG table=filter:84 family=2 entries=12 op=nft_register_rule pid=3042 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:53.825000 audit[3042]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffd882aaa10 a2=0 a3=7ffd882aa9fc items=0 ppid=2919 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.825000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:53.827000 audit[3042]: NETFILTER_CFG table=nat:85 family=2 entries=20 op=nft_register_rule pid=3042 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:53.827000 audit[3042]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffd882aaa10 a2=0 a3=7ffd882aa9fc items=0 ppid=2919 pid=3042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.827000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:53.841000 audit[3043]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3043 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.841000 audit[3043]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffdcef46c60 a2=0 a3=7ffdcef46c4c items=0 ppid=2919 pid=3043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.841000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Feb 13 08:29:53.848000 audit[3045]: NETFILTER_CFG table=filter:87 family=10 entries=2 op=nft_register_chain pid=3045 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.848000 audit[3045]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffe853a08a0 a2=0 a3=7ffe853a088c items=0 ppid=2919 pid=3045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.848000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Feb 13 08:29:53.858000 audit[3048]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.858000 audit[3048]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffea2f64bb0 a2=0 a3=7ffea2f64b9c items=0 ppid=2919 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.858000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Feb 13 08:29:53.862000 audit[3049]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3049 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.862000 audit[3049]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffccc6e9860 a2=0 a3=7ffccc6e984c items=0 ppid=2919 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.862000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Feb 13 08:29:53.867000 audit[3051]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3051 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.867000 audit[3051]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcc2b0d8a0 a2=0 a3=7ffcc2b0d88c items=0 ppid=2919 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.867000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Feb 13 08:29:53.871000 audit[3052]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3052 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.871000 audit[3052]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc1c474930 a2=0 a3=7ffc1c47491c items=0 ppid=2919 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.871000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Feb 13 08:29:53.877000 audit[3054]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3054 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.877000 audit[3054]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffe509ae0f0 a2=0 a3=7ffe509ae0dc items=0 ppid=2919 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.877000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Feb 13 08:29:53.886000 audit[3057]: NETFILTER_CFG table=filter:93 family=10 entries=2 op=nft_register_chain pid=3057 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.886000 audit[3057]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffc4d2beda0 a2=0 a3=7ffc4d2bed8c items=0 ppid=2919 pid=3057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.886000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Feb 13 08:29:53.888000 audit[3058]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_chain pid=3058 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.888000 audit[3058]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=100 a0=3 a1=7fff9af2a9d0 a2=0 a3=7fff9af2a9bc items=0 ppid=2919 pid=3058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.888000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Feb 13 08:29:53.895000 audit[3060]: NETFILTER_CFG table=filter:95 family=10 entries=1 op=nft_register_rule pid=3060 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.895000 audit[3060]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffee982cfe0 a2=0 a3=7ffee982cfcc items=0 ppid=2919 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.895000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Feb 13 08:29:53.898000 audit[3061]: NETFILTER_CFG table=filter:96 family=10 entries=1 op=nft_register_chain pid=3061 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.898000 audit[3061]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffef8cc1080 a2=0 a3=7ffef8cc106c items=0 ppid=2919 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.898000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Feb 13 08:29:53.904000 audit[3063]: NETFILTER_CFG table=filter:97 
family=10 entries=1 op=nft_register_rule pid=3063 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.904000 audit[3063]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff4a83cc40 a2=0 a3=7fff4a83cc2c items=0 ppid=2919 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.904000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Feb 13 08:29:53.912000 audit[3066]: NETFILTER_CFG table=filter:98 family=10 entries=1 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.912000 audit[3066]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff6e6a00d0 a2=0 a3=7fff6e6a00bc items=0 ppid=2919 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.912000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Feb 13 08:29:53.922000 audit[3069]: NETFILTER_CFG table=filter:99 family=10 entries=1 op=nft_register_rule pid=3069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.922000 audit[3069]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd8cb02fc0 a2=0 a3=7ffd8cb02fac items=0 ppid=2919 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.922000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Feb 13 08:29:53.924000 audit[3070]: NETFILTER_CFG table=nat:100 family=10 entries=1 op=nft_register_chain pid=3070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.924000 audit[3070]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffef5265220 a2=0 a3=7ffef526520c items=0 ppid=2919 pid=3070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.924000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Feb 13 08:29:53.930000 audit[3072]: NETFILTER_CFG table=nat:101 family=10 entries=2 op=nft_register_chain pid=3072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.930000 audit[3072]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7fffe3b8fc00 a2=0 a3=7fffe3b8fbec items=0 ppid=2919 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.930000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:29:53.938000 audit[3075]: NETFILTER_CFG table=nat:102 family=10 entries=2 op=nft_register_chain pid=3075 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Feb 13 08:29:53.938000 audit[3075]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffc806c2060 a2=0 a3=7ffc806c204c items=0 ppid=2919 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.938000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Feb 13 08:29:53.951000 audit[3079]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 08:29:53.951000 audit[3079]: SYSCALL arch=c000003e syscall=46 success=yes exit=1916 a0=3 a1=7ffea13b7ab0 a2=0 a3=7ffea13b7a9c items=0 ppid=2919 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.951000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:53.952000 audit[3079]: NETFILTER_CFG table=nat:104 family=10 entries=10 op=nft_register_chain pid=3079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Feb 13 08:29:53.952000 audit[3079]: SYSCALL arch=c000003e syscall=46 success=yes exit=1968 a0=3 a1=7ffea13b7ab0 a2=0 a3=7ffea13b7a9c items=0 ppid=2919 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:53.952000 audit: PROCTITLE 
proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:54.315474 kubelet[2614]: I0213 08:29:54.315381 2614 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-t4wv2" podStartSLOduration=2.31527683 pod.CreationTimestamp="2024-02-13 08:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:29:54.315027307 +0000 UTC m=+14.117919214" watchObservedRunningTime="2024-02-13 08:29:54.31527683 +0000 UTC m=+14.118168711" Feb 13 08:29:57.408478 env[1475]: time="2024-02-13T08:29:57.408454946Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:57.409185 env[1475]: time="2024-02-13T08:29:57.409173616Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:57.410115 env[1475]: time="2024-02-13T08:29:57.410071037Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.32.3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:57.410840 env[1475]: time="2024-02-13T08:29:57.410801929Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:715ac9a30f8a9579e44258af20de354715429e11836b493918e9e1a696e9b028,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:29:57.411273 env[1475]: time="2024-02-13T08:29:57.411221636Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.32.3\" returns image reference \"sha256:7bc79e0d3be4fa8c35133127424f9b1ec775af43145b7dd58637905c76084827\"" Feb 13 08:29:57.412285 env[1475]: 
time="2024-02-13T08:29:57.412270662Z" level=info msg="CreateContainer within sandbox \"530ec53a771d588d54d123622765f4a6dcd9a6cc36d085e6aff90de2f03b65c2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 08:29:57.417317 env[1475]: time="2024-02-13T08:29:57.417281849Z" level=info msg="CreateContainer within sandbox \"530ec53a771d588d54d123622765f4a6dcd9a6cc36d085e6aff90de2f03b65c2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"fce36ab8d1e31ac1527e23ff7f0e8037cfcaf9a2a53fd07275d464d3fc8f237e\"" Feb 13 08:29:57.417607 env[1475]: time="2024-02-13T08:29:57.417564400Z" level=info msg="StartContainer for \"fce36ab8d1e31ac1527e23ff7f0e8037cfcaf9a2a53fd07275d464d3fc8f237e\"" Feb 13 08:29:57.426119 systemd[1]: Started cri-containerd-fce36ab8d1e31ac1527e23ff7f0e8037cfcaf9a2a53fd07275d464d3fc8f237e.scope. Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.431000 audit: BPF prog-id=123 op=LOAD Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c0001bdc48 a2=10 a3=1c items=0 ppid=2861 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:57.432000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663653336616238643165333161633135323765323366663766306538 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001bd6b0 a2=3c a3=8 items=0 ppid=2861 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:57.432000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663653336616238643165333161633135323765323366663766306538 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit: BPF prog-id=124 op=LOAD Feb 13 08:29:57.432000 audit[3087]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bd9d8 a2=78 a3=c0000247b0 items=0 ppid=2861 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:57.432000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663653336616238643165333161633135323765323366663766306538 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit: BPF prog-id=125 op=LOAD Feb 13 08:29:57.432000 audit[3087]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c0001bd770 a2=78 a3=c0000247f8 items=0 ppid=2861 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:57.432000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663653336616238643165333161633135323765323366663766306538 Feb 13 08:29:57.432000 audit: BPF prog-id=125 op=UNLOAD Feb 13 08:29:57.432000 audit: BPF prog-id=124 op=UNLOAD Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { perfmon } for pid=3087 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit[3087]: AVC avc: denied { bpf } for pid=3087 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:57.432000 audit: BPF prog-id=126 op=LOAD Feb 13 08:29:57.432000 audit[3087]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001bdc30 a2=78 a3=c000024c08 items=0 ppid=2861 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:57.432000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663653336616238643165333161633135323765323366663766306538 Feb 13 08:29:57.439055 env[1475]: time="2024-02-13T08:29:57.439033598Z" level=info msg="StartContainer for \"fce36ab8d1e31ac1527e23ff7f0e8037cfcaf9a2a53fd07275d464d3fc8f237e\" returns successfully" Feb 13 08:29:58.319918 kubelet[2614]: I0213 08:29:58.319898 2614 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-cfc98749c-rh6gg" podStartSLOduration=-9.223372030534897e+09 pod.CreationTimestamp="2024-02-13 08:29:52 +0000 UTC" firstStartedPulling="2024-02-13 08:29:53.456111701 +0000 UTC m=+13.259003537" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:29:58.31974402 +0000 UTC m=+18.122635850" watchObservedRunningTime="2024-02-13 08:29:58.319879097 +0000 UTC m=+18.122770925" Feb 13 08:29:58.695244 
systemd[1]: Started sshd@8-145.40.67.89:22-43.153.15.221:59402.service. Feb 13 08:29:58.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.67.89:22-43.153.15.221:59402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:58.722449 kernel: kauditd_printk_skb: 358 callbacks suppressed Feb 13 08:29:58.722533 kernel: audit: type=1130 audit(1707812998.694:1088): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.67.89:22-43.153.15.221:59402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:29:58.881890 sshd[3129]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:29:58.881000 audit[3129]: USER_AUTH pid=3129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:29:58.969968 kernel: audit: type=1100 audit(1707812998.881:1089): pid=3129 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:29:59.179000 audit[3156]: NETFILTER_CFG table=filter:105 family=2 entries=13 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.232524 kubelet[2614]: I0213 08:29:59.232495 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:59.237458 systemd[1]: Created slice kubepods-besteffort-pod3da5b7da_ece4_41b5_af0b_d383926ab9e0.slice. 
Feb 13 08:29:59.179000 audit[3156]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7fffad066f30 a2=0 a3=7fffad066f1c items=0 ppid=2919 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.265723 kubelet[2614]: I0213 08:29:59.265702 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:59.268450 systemd[1]: Created slice kubepods-besteffort-pod8e8b3a92_5cc2_4cc6_af08_c9aed5158746.slice. Feb 13 08:29:59.346267 kernel: audit: type=1325 audit(1707812999.179:1090): table=filter:105 family=2 entries=13 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.346324 kernel: audit: type=1300 audit(1707812999.179:1090): arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7fffad066f30 a2=0 a3=7fffad066f1c items=0 ppid=2919 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.346340 kernel: audit: type=1327 audit(1707812999.179:1090): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:59.179000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:59.366459 kubelet[2614]: I0213 08:29:59.366414 2614 topology_manager.go:210] "Topology Admit Handler" Feb 13 08:29:59.366676 kubelet[2614]: E0213 08:29:59.366580 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" 
podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:29:59.383164 kubelet[2614]: I0213 08:29:59.383114 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxvv7\" (UniqueName: \"kubernetes.io/projected/3da5b7da-ece4-41b5-af0b-d383926ab9e0-kube-api-access-qxvv7\") pod \"calico-typha-5c7b7d8757-mv8pm\" (UID: \"3da5b7da-ece4-41b5-af0b-d383926ab9e0\") " pod="calico-system/calico-typha-5c7b7d8757-mv8pm" Feb 13 08:29:59.383164 kubelet[2614]: I0213 08:29:59.383146 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-flexvol-driver-host\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383164 kubelet[2614]: I0213 08:29:59.383165 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-tigera-ca-bundle\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383292 kubelet[2614]: I0213 08:29:59.383178 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-node-certs\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383292 kubelet[2614]: I0213 08:29:59.383189 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-var-lib-calico\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " 
pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383292 kubelet[2614]: I0213 08:29:59.383204 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-cni-bin-dir\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383292 kubelet[2614]: I0213 08:29:59.383215 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-lib-modules\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383292 kubelet[2614]: I0213 08:29:59.383244 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-policysync\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383377 kubelet[2614]: I0213 08:29:59.383290 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-var-run-calico\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383377 kubelet[2614]: I0213 08:29:59.383312 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-cni-net-dir\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383377 kubelet[2614]: I0213 
08:29:59.383325 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plpgj\" (UniqueName: \"kubernetes.io/projected/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-kube-api-access-plpgj\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383377 kubelet[2614]: I0213 08:29:59.383351 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-xtables-lock\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.383452 kubelet[2614]: I0213 08:29:59.383378 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da5b7da-ece4-41b5-af0b-d383926ab9e0-tigera-ca-bundle\") pod \"calico-typha-5c7b7d8757-mv8pm\" (UID: \"3da5b7da-ece4-41b5-af0b-d383926ab9e0\") " pod="calico-system/calico-typha-5c7b7d8757-mv8pm" Feb 13 08:29:59.383452 kubelet[2614]: I0213 08:29:59.383405 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/3da5b7da-ece4-41b5-af0b-d383926ab9e0-typha-certs\") pod \"calico-typha-5c7b7d8757-mv8pm\" (UID: \"3da5b7da-ece4-41b5-af0b-d383926ab9e0\") " pod="calico-system/calico-typha-5c7b7d8757-mv8pm" Feb 13 08:29:59.383452 kubelet[2614]: I0213 08:29:59.383423 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8e8b3a92-5cc2-4cc6-af08-c9aed5158746-cni-log-dir\") pod \"calico-node-ld62c\" (UID: \"8e8b3a92-5cc2-4cc6-af08-c9aed5158746\") " pod="calico-system/calico-node-ld62c" Feb 13 08:29:59.180000 audit[3156]: NETFILTER_CFG 
table=nat:106 family=2 entries=20 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.180000 audit[3156]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7fffad066f30 a2=0 a3=7fffad066f1c items=0 ppid=2919 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.483656 kubelet[2614]: I0213 08:29:59.483603 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/767512d8-ec8b-4a84-be29-2de84e2dbb6e-socket-dir\") pod \"csi-node-driver-8wh4k\" (UID: \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\") " pod="calico-system/csi-node-driver-8wh4k" Feb 13 08:29:59.483656 kubelet[2614]: I0213 08:29:59.483652 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/767512d8-ec8b-4a84-be29-2de84e2dbb6e-registration-dir\") pod \"csi-node-driver-8wh4k\" (UID: \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\") " pod="calico-system/csi-node-driver-8wh4k" Feb 13 08:29:59.483734 kubelet[2614]: I0213 08:29:59.483666 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/767512d8-ec8b-4a84-be29-2de84e2dbb6e-kubelet-dir\") pod \"csi-node-driver-8wh4k\" (UID: \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\") " pod="calico-system/csi-node-driver-8wh4k" Feb 13 08:29:59.483734 kubelet[2614]: I0213 08:29:59.483728 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/767512d8-ec8b-4a84-be29-2de84e2dbb6e-varrun\") pod \"csi-node-driver-8wh4k\" (UID: \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\") " 
pod="calico-system/csi-node-driver-8wh4k" Feb 13 08:29:59.483996 kubelet[2614]: E0213 08:29:59.483957 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.483996 kubelet[2614]: W0213 08:29:59.483966 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.483996 kubelet[2614]: E0213 08:29:59.483976 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.483996 kubelet[2614]: I0213 08:29:59.483989 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b27vh\" (UniqueName: \"kubernetes.io/projected/767512d8-ec8b-4a84-be29-2de84e2dbb6e-kube-api-access-b27vh\") pod \"csi-node-driver-8wh4k\" (UID: \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\") " pod="calico-system/csi-node-driver-8wh4k" Feb 13 08:29:59.484138 kubelet[2614]: E0213 08:29:59.484100 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484138 kubelet[2614]: W0213 08:29:59.484107 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484138 kubelet[2614]: E0213 08:29:59.484117 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.484273 kubelet[2614]: E0213 08:29:59.484238 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484273 kubelet[2614]: W0213 08:29:59.484245 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484273 kubelet[2614]: E0213 08:29:59.484253 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.484371 kubelet[2614]: E0213 08:29:59.484364 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484371 kubelet[2614]: W0213 08:29:59.484371 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484412 kubelet[2614]: E0213 08:29:59.484380 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.484493 kubelet[2614]: E0213 08:29:59.484486 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484514 kubelet[2614]: W0213 08:29:59.484493 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484514 kubelet[2614]: E0213 08:29:59.484502 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.484615 kubelet[2614]: E0213 08:29:59.484610 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484615 kubelet[2614]: W0213 08:29:59.484614 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484652 kubelet[2614]: E0213 08:29:59.484621 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.484730 kubelet[2614]: E0213 08:29:59.484725 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484730 kubelet[2614]: W0213 08:29:59.484729 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484770 kubelet[2614]: E0213 08:29:59.484737 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.484832 kubelet[2614]: E0213 08:29:59.484828 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484853 kubelet[2614]: W0213 08:29:59.484832 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484853 kubelet[2614]: E0213 08:29:59.484840 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.484907 kubelet[2614]: E0213 08:29:59.484902 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484928 kubelet[2614]: W0213 08:29:59.484907 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.484928 kubelet[2614]: E0213 08:29:59.484914 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.484989 kubelet[2614]: E0213 08:29:59.484984 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.484989 kubelet[2614]: W0213 08:29:59.484988 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485026 kubelet[2614]: E0213 08:29:59.484994 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.485097 kubelet[2614]: E0213 08:29:59.485090 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485097 kubelet[2614]: W0213 08:29:59.485097 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485155 kubelet[2614]: E0213 08:29:59.485106 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.485196 kubelet[2614]: E0213 08:29:59.485190 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485230 kubelet[2614]: W0213 08:29:59.485196 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485230 kubelet[2614]: E0213 08:29:59.485205 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.485298 kubelet[2614]: E0213 08:29:59.485291 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485335 kubelet[2614]: W0213 08:29:59.485298 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485335 kubelet[2614]: E0213 08:29:59.485311 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.485407 kubelet[2614]: E0213 08:29:59.485401 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485441 kubelet[2614]: W0213 08:29:59.485406 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485441 kubelet[2614]: E0213 08:29:59.485417 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.485517 kubelet[2614]: E0213 08:29:59.485511 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485555 kubelet[2614]: W0213 08:29:59.485517 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485555 kubelet[2614]: E0213 08:29:59.485527 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.485666 kubelet[2614]: E0213 08:29:59.485658 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485666 kubelet[2614]: W0213 08:29:59.485664 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.485727 kubelet[2614]: E0213 08:29:59.485671 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.485989 kubelet[2614]: E0213 08:29:59.485982 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.485989 kubelet[2614]: W0213 08:29:59.485988 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.486043 kubelet[2614]: E0213 08:29:59.485998 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.565558 kernel: audit: type=1325 audit(1707812999.180:1091): table=nat:106 family=2 entries=20 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.565592 kernel: audit: type=1300 audit(1707812999.180:1091): arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7fffad066f30 a2=0 a3=7fffad066f1c items=0 ppid=2919 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.565607 kernel: audit: type=1327 audit(1707812999.180:1091): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:59.180000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:59.586273 kubelet[2614]: E0213 08:29:59.586245 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586273 kubelet[2614]: W0213 08:29:59.586256 2614 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586273 kubelet[2614]: E0213 08:29:59.586267 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.586419 kubelet[2614]: E0213 08:29:59.586382 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586419 kubelet[2614]: W0213 08:29:59.586388 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586419 kubelet[2614]: E0213 08:29:59.586395 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.586501 kubelet[2614]: E0213 08:29:59.586485 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586501 kubelet[2614]: W0213 08:29:59.586491 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586501 kubelet[2614]: E0213 08:29:59.586500 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.586582 kubelet[2614]: E0213 08:29:59.586576 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586582 kubelet[2614]: W0213 08:29:59.586581 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586622 kubelet[2614]: E0213 08:29:59.586588 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.586678 kubelet[2614]: E0213 08:29:59.586672 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586678 kubelet[2614]: W0213 08:29:59.586677 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586719 kubelet[2614]: E0213 08:29:59.586685 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.586800 kubelet[2614]: E0213 08:29:59.586795 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586800 kubelet[2614]: W0213 08:29:59.586800 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586839 kubelet[2614]: E0213 08:29:59.586810 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.586914 kubelet[2614]: E0213 08:29:59.586909 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.586914 kubelet[2614]: W0213 08:29:59.586914 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.586977 kubelet[2614]: E0213 08:29:59.586920 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.587066 kubelet[2614]: E0213 08:29:59.587058 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587066 kubelet[2614]: W0213 08:29:59.587066 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587126 kubelet[2614]: E0213 08:29:59.587078 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.587172 kubelet[2614]: E0213 08:29:59.587166 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587172 kubelet[2614]: W0213 08:29:59.587171 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587231 kubelet[2614]: E0213 08:29:59.587181 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.587265 kubelet[2614]: E0213 08:29:59.587259 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587298 kubelet[2614]: W0213 08:29:59.587265 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587298 kubelet[2614]: E0213 08:29:59.587274 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.587376 kubelet[2614]: E0213 08:29:59.587370 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587376 kubelet[2614]: W0213 08:29:59.587376 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587434 kubelet[2614]: E0213 08:29:59.587387 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.587520 kubelet[2614]: E0213 08:29:59.587514 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587520 kubelet[2614]: W0213 08:29:59.587519 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587579 kubelet[2614]: E0213 08:29:59.587530 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.587657 kubelet[2614]: E0213 08:29:59.587651 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587657 kubelet[2614]: W0213 08:29:59.587656 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587716 kubelet[2614]: E0213 08:29:59.587666 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.587757 kubelet[2614]: E0213 08:29:59.587751 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587757 kubelet[2614]: W0213 08:29:59.587757 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587811 kubelet[2614]: E0213 08:29:59.587766 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.587849 kubelet[2614]: E0213 08:29:59.587843 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587871 kubelet[2614]: W0213 08:29:59.587849 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587871 kubelet[2614]: E0213 08:29:59.587857 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.587924 kubelet[2614]: E0213 08:29:59.587919 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.587962 kubelet[2614]: W0213 08:29:59.587923 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.587962 kubelet[2614]: E0213 08:29:59.587933 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.588024 kubelet[2614]: E0213 08:29:59.588017 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588057 kubelet[2614]: W0213 08:29:59.588024 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588057 kubelet[2614]: E0213 08:29:59.588034 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.588153 kubelet[2614]: E0213 08:29:59.588147 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588188 kubelet[2614]: W0213 08:29:59.588153 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588188 kubelet[2614]: E0213 08:29:59.588163 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.588244 kubelet[2614]: E0213 08:29:59.588238 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588276 kubelet[2614]: W0213 08:29:59.588244 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588276 kubelet[2614]: E0213 08:29:59.588254 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.588333 kubelet[2614]: E0213 08:29:59.588327 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588366 kubelet[2614]: W0213 08:29:59.588332 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588366 kubelet[2614]: E0213 08:29:59.588343 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.588448 kubelet[2614]: E0213 08:29:59.588442 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588448 kubelet[2614]: W0213 08:29:59.588447 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588504 kubelet[2614]: E0213 08:29:59.588457 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.588538 kubelet[2614]: E0213 08:29:59.588533 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588575 kubelet[2614]: W0213 08:29:59.588538 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588575 kubelet[2614]: E0213 08:29:59.588548 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.588632 kubelet[2614]: E0213 08:29:59.588627 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588632 kubelet[2614]: W0213 08:29:59.588632 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588688 kubelet[2614]: E0213 08:29:59.588643 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.588724 kubelet[2614]: E0213 08:29:59.588718 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588756 kubelet[2614]: W0213 08:29:59.588724 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588756 kubelet[2614]: E0213 08:29:59.588734 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.588828 kubelet[2614]: E0213 08:29:59.588822 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588828 kubelet[2614]: W0213 08:29:59.588828 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588888 kubelet[2614]: E0213 08:29:59.588837 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.588916 kubelet[2614]: E0213 08:29:59.588913 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.588958 kubelet[2614]: W0213 08:29:59.588919 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.588958 kubelet[2614]: E0213 08:29:59.588930 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.589029 kubelet[2614]: E0213 08:29:59.589024 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.589029 kubelet[2614]: W0213 08:29:59.589029 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.589075 kubelet[2614]: E0213 08:29:59.589037 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.426000 audit[3183]: NETFILTER_CFG table=filter:107 family=2 entries=14 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.639519 kubelet[2614]: E0213 08:29:59.639502 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.639519 kubelet[2614]: W0213 08:29:59.639514 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.639614 kubelet[2614]: E0213 08:29:59.639527 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.426000 audit[3183]: SYSCALL arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffe59204560 a2=0 a3=7ffe5920454c items=0 ppid=2919 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.688317 kubelet[2614]: E0213 08:29:59.688306 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.688317 kubelet[2614]: W0213 08:29:59.688316 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.688407 kubelet[2614]: E0213 08:29:59.688330 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.688440 kubelet[2614]: E0213 08:29:59.688433 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.688440 kubelet[2614]: W0213 08:29:59.688439 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.688493 kubelet[2614]: E0213 08:29:59.688449 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.779758 kernel: audit: type=1325 audit(1707812999.426:1092): table=filter:107 family=2 entries=14 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.779794 kernel: audit: type=1300 audit(1707812999.426:1092): arch=c000003e syscall=46 success=yes exit=4732 a0=3 a1=7ffe59204560 a2=0 a3=7ffe5920454c items=0 ppid=2919 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.426000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:59.623000 audit[3183]: NETFILTER_CFG table=nat:108 family=2 entries=20 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:29:59.623000 audit[3183]: SYSCALL arch=c000003e syscall=46 success=yes exit=5340 a0=3 a1=7ffe59204560 a2=0 a3=7ffe5920454c items=0 ppid=2919 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Feb 13 08:29:59.623000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:29:59.789334 kubelet[2614]: E0213 08:29:59.789290 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.789334 kubelet[2614]: W0213 08:29:59.789299 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.789334 kubelet[2614]: E0213 08:29:59.789310 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.789480 kubelet[2614]: E0213 08:29:59.789437 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.789480 kubelet[2614]: W0213 08:29:59.789443 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.789480 kubelet[2614]: E0213 08:29:59.789451 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:29:59.839832 kubelet[2614]: E0213 08:29:59.839815 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.839832 kubelet[2614]: W0213 08:29:59.839829 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.839961 kubelet[2614]: E0213 08:29:59.839847 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.839990 env[1475]: time="2024-02-13T08:29:59.839865014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5c7b7d8757-mv8pm,Uid:3da5b7da-ece4-41b5-af0b-d383926ab9e0,Namespace:calico-system,Attempt:0,}" Feb 13 08:29:59.847301 env[1475]: time="2024-02-13T08:29:59.847251422Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:29:59.847301 env[1475]: time="2024-02-13T08:29:59.847278294Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:29:59.847301 env[1475]: time="2024-02-13T08:29:59.847287945Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:29:59.847445 env[1475]: time="2024-02-13T08:29:59.847368832Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/1a7db042edb6eca94ac1ae5029cc56d7a3eda26bfaf7e19b9b0ea9581cbe6c99 pid=3245 runtime=io.containerd.runc.v2 Feb 13 08:29:59.854944 systemd[1]: Started cri-containerd-1a7db042edb6eca94ac1ae5029cc56d7a3eda26bfaf7e19b9b0ea9581cbe6c99.scope. 
Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit: BPF prog-id=127 op=LOAD Feb 13 08:29:59.861000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.861000 audit[3254]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000147c48 a2=10 a3=1c items=0 ppid=3245 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161376462303432656462366563613934616331616535303239636335 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001476b0 a2=3c a3=c items=0 ppid=3245 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161376462303432656462366563613934616331616535303239636335 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Feb 13 08:29:59.862000 audit: BPF prog-id=128 op=LOAD Feb 13 08:29:59.862000 audit[3254]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001479d8 a2=78 a3=c000308520 items=0 ppid=3245 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161376462303432656462366563613934616331616535303239636335 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 
13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit: BPF prog-id=129 op=LOAD Feb 13 08:29:59.862000 audit[3254]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000147770 a2=78 a3=c000308568 items=0 ppid=3245 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161376462303432656462366563613934616331616535303239636335 Feb 13 08:29:59.862000 audit: BPF prog-id=129 op=UNLOAD Feb 13 08:29:59.862000 audit: BPF prog-id=128 op=UNLOAD Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { perfmon } for pid=3254 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit[3254]: AVC avc: denied { bpf } for pid=3254 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.862000 audit: BPF prog-id=130 op=LOAD Feb 13 08:29:59.862000 audit[3254]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000147c30 a2=78 a3=c000308978 items=0 ppid=3245 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Feb 13 08:29:59.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161376462303432656462366563613934616331616535303239636335 Feb 13 08:29:59.870222 env[1475]: time="2024-02-13T08:29:59.870159956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ld62c,Uid:8e8b3a92-5cc2-4cc6-af08-c9aed5158746,Namespace:calico-system,Attempt:0,}" Feb 13 08:29:59.877315 env[1475]: time="2024-02-13T08:29:59.877270365Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 08:29:59.877315 env[1475]: time="2024-02-13T08:29:59.877297653Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 08:29:59.877315 env[1475]: time="2024-02-13T08:29:59.877307371Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 08:29:59.877481 env[1475]: time="2024-02-13T08:29:59.877422794Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2 pid=3276 runtime=io.containerd.runc.v2 Feb 13 08:29:59.884551 env[1475]: time="2024-02-13T08:29:59.884515629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5c7b7d8757-mv8pm,Uid:3da5b7da-ece4-41b5-af0b-d383926ab9e0,Namespace:calico-system,Attempt:0,} returns sandbox id \"1a7db042edb6eca94ac1ae5029cc56d7a3eda26bfaf7e19b9b0ea9581cbe6c99\"" Feb 13 08:29:59.884930 systemd[1]: Started cri-containerd-ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2.scope. 
Feb 13 08:29:59.885653 env[1475]: time="2024-02-13T08:29:59.885631928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\"" Feb 13 08:29:59.890002 kubelet[2614]: E0213 08:29:59.889956 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.890002 kubelet[2614]: W0213 08:29:59.889967 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.890002 kubelet[2614]: E0213 08:29:59.889981 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit: BPF prog-id=131 op=LOAD Feb 13 08:29:59.891000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.891000 audit[3287]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000197c48 a2=10 a3=1c items=0 ppid=3276 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.891000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306630313333356666383661346537633633336235626138653637 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=c items=0 ppid=3276 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306630313333356666383661346537633633336235626138653637 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied 
{ perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit: BPF prog-id=132 op=LOAD Feb 13 08:29:59.892000 audit[3287]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c00032cad0 items=0 ppid=3276 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306630313333356666383661346537633633336235626138653637 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 
comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit: BPF prog-id=133 op=LOAD Feb 13 08:29:59.892000 audit[3287]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=c00032cb18 items=0 ppid=3276 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.892000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306630313333356666383661346537633633336235626138653637 Feb 13 08:29:59.892000 audit: BPF prog-id=133 op=UNLOAD Feb 13 08:29:59.892000 audit: BPF prog-id=132 op=UNLOAD Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { perfmon } for pid=3287 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit[3287]: AVC avc: denied { bpf } for pid=3287 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:29:59.892000 audit: BPF prog-id=134 op=LOAD Feb 13 08:29:59.892000 audit[3287]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c00032cf28 items=0 ppid=3276 pid=3287 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:29:59.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6165306630313333356666383661346537633633336235626138653637 Feb 13 08:29:59.898708 env[1475]: time="2024-02-13T08:29:59.898675547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ld62c,Uid:8e8b3a92-5cc2-4cc6-af08-c9aed5158746,Namespace:calico-system,Attempt:0,} returns sandbox id \"ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2\"" Feb 13 08:29:59.990841 kubelet[2614]: E0213 08:29:59.990824 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:29:59.990841 kubelet[2614]: W0213 08:29:59.990837 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:29:59.990971 
kubelet[2614]: E0213 08:29:59.990852 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:00.040384 kubelet[2614]: E0213 08:30:00.040336 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:00.040384 kubelet[2614]: W0213 08:30:00.040347 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:00.040384 kubelet[2614]: E0213 08:30:00.040361 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:00.757492 sshd[3129]: Failed password for root from 43.153.15.221 port 59402 ssh2 Feb 13 08:30:01.265494 kubelet[2614]: E0213 08:30:01.265398 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:01.749023 sshd[3129]: Received disconnect from 43.153.15.221 port 59402:11: Bye Bye [preauth] Feb 13 08:30:01.749023 sshd[3129]: Disconnected from authenticating user root 43.153.15.221 port 59402 [preauth] Feb 13 08:30:01.751296 systemd[1]: sshd@8-145.40.67.89:22-43.153.15.221:59402.service: Deactivated successfully. Feb 13 08:30:01.750000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-145.40.67.89:22-43.153.15.221:59402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:30:03.265271 kubelet[2614]: E0213 08:30:03.265200 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:05.265197 kubelet[2614]: E0213 08:30:05.265091 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:06.111867 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount613331282.mount: Deactivated successfully. Feb 13 08:30:07.265124 kubelet[2614]: E0213 08:30:07.265080 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:09.265108 kubelet[2614]: E0213 08:30:09.264999 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:11.264763 kubelet[2614]: E0213 08:30:11.264716 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" 
podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:13.264342 kubelet[2614]: E0213 08:30:13.264288 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:15.265027 kubelet[2614]: E0213 08:30:15.264977 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:17.265135 kubelet[2614]: E0213 08:30:17.265061 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:19.265222 kubelet[2614]: E0213 08:30:19.265156 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:21.264200 kubelet[2614]: E0213 08:30:21.264152 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:23.265127 kubelet[2614]: E0213 08:30:23.265081 
2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:25.265323 kubelet[2614]: E0213 08:30:25.265217 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:27.265302 kubelet[2614]: E0213 08:30:27.265201 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:29.264243 kubelet[2614]: E0213 08:30:29.264198 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:31.265473 kubelet[2614]: E0213 08:30:31.265301 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:33.265699 kubelet[2614]: E0213 08:30:33.265604 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:35.264567 kubelet[2614]: E0213 08:30:35.264520 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:35.603000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.631893 kernel: kauditd_printk_skb: 119 callbacks suppressed Feb 13 08:30:35.631982 kernel: audit: type=1400 audit(1707813035.603:1131): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.603000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e8bf00 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:35.723006 kernel: audit: type=1300 audit(1707813035.603:1131): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e8bf00 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:35.603000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:35.935285 kernel: audit: type=1327 audit(1707813035.603:1131): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:35.935355 kernel: audit: type=1400 audit(1707813035.603:1132): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.603000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:36.027203 kernel: audit: type=1300 audit(1707813035.603:1132): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000d615f0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:35.603000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000d615f0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:36.147654 kernel: audit: type=1327 audit(1707813035.603:1132): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:35.603000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:36.240326 kernel: audit: type=1400 audit(1707813035.886:1133): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:36.331641 kernel: audit: type=1300 audit(1707813035.886:1133): arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c002e1ffe0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c002e1ffe0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:36.431214 kernel: audit: type=1327 audit(1707813035.886:1133): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:36.618016 kernel: audit: type=1400 audit(1707813035.886:1134): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c008bc2c30 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:35.886000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c004061cb0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:35.887000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0123bc220 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:35.887000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c008bc2db0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:35.887000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c008bc2de0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:30:35.887000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:30:37.264706 kubelet[2614]: E0213 08:30:37.264692 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:39.264891 kubelet[2614]: E0213 08:30:39.264786 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:40.940000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:40.969623 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 08:30:40.969659 kernel: audit: type=1400 audit(1707813040.940:1139): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:40.940000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0013258e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:41.183553 kernel: audit: type=1300 audit(1707813040.940:1139): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0013258e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:41.183625 kernel: audit: type=1327 audit(1707813040.940:1139): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:40.940000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:41.264170 kubelet[2614]: E0213 08:30:41.264113 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:41.277834 kernel: audit: type=1400 audit(1707813040.942:1140): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:40.942000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:41.370368 kernel: audit: type=1300 audit(1707813040.942:1140): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0002fe8e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:40.942000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0002fe8e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:41.491240 kernel: audit: type=1327 audit(1707813040.942:1140): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:40.942000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:41.584670 kernel: audit: type=1400 audit(1707813040.945:1141): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:40.945000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" 
dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:41.675007 kernel: audit: type=1300 audit(1707813040.945:1141): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0002fe940 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:40.945000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0002fe940 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:41.795818 kernel: audit: type=1327 audit(1707813040.945:1141): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:40.945000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:41.889350 kernel: audit: type=1400 audit(1707813040.947:1142): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:40.947000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:30:40.947000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0002fea60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:30:40.947000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:30:43.265410 kubelet[2614]: E0213 08:30:43.265310 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:45.265241 kubelet[2614]: E0213 08:30:45.265216 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:47.264974 kubelet[2614]: E0213 08:30:47.264925 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 
13 08:30:49.265038 kubelet[2614]: E0213 08:30:49.264993 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:51.265176 kubelet[2614]: E0213 08:30:51.265156 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:53.264697 kubelet[2614]: E0213 08:30:53.264650 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:55.264943 kubelet[2614]: E0213 08:30:55.264898 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:57.265001 kubelet[2614]: E0213 08:30:57.264956 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:30:57.302482 kubelet[2614]: E0213 08:30:57.302388 2614 driver-call.go:262] Failed to unmarshal 
output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.302482 kubelet[2614]: W0213 08:30:57.302431 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.302482 kubelet[2614]: E0213 08:30:57.302476 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.303042 kubelet[2614]: E0213 08:30:57.303015 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.303163 kubelet[2614]: W0213 08:30:57.303047 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.303163 kubelet[2614]: E0213 08:30:57.303086 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.303699 kubelet[2614]: E0213 08:30:57.303627 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.303699 kubelet[2614]: W0213 08:30:57.303660 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.303699 kubelet[2614]: E0213 08:30:57.303702 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.304335 kubelet[2614]: E0213 08:30:57.304263 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.304335 kubelet[2614]: W0213 08:30:57.304297 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.304335 kubelet[2614]: E0213 08:30:57.304334 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.304889 kubelet[2614]: E0213 08:30:57.304852 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.304889 kubelet[2614]: W0213 08:30:57.304887 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.305172 kubelet[2614]: E0213 08:30:57.304947 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.305473 kubelet[2614]: E0213 08:30:57.305401 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.305473 kubelet[2614]: W0213 08:30:57.305434 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.305473 kubelet[2614]: E0213 08:30:57.305473 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.306095 kubelet[2614]: E0213 08:30:57.306009 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.306095 kubelet[2614]: W0213 08:30:57.306034 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.306095 kubelet[2614]: E0213 08:30:57.306069 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.306659 kubelet[2614]: E0213 08:30:57.306591 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.306659 kubelet[2614]: W0213 08:30:57.306627 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.306895 kubelet[2614]: E0213 08:30:57.306666 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.307202 kubelet[2614]: E0213 08:30:57.307168 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.307330 kubelet[2614]: W0213 08:30:57.307202 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.307330 kubelet[2614]: E0213 08:30:57.307241 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.307853 kubelet[2614]: E0213 08:30:57.307827 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.307984 kubelet[2614]: W0213 08:30:57.307854 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.307984 kubelet[2614]: E0213 08:30:57.307889 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.308488 kubelet[2614]: E0213 08:30:57.308397 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.308488 kubelet[2614]: W0213 08:30:57.308430 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.308488 kubelet[2614]: E0213 08:30:57.308469 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.309041 kubelet[2614]: E0213 08:30:57.308972 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.309041 kubelet[2614]: W0213 08:30:57.308999 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.309041 kubelet[2614]: E0213 08:30:57.309035 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.309602 kubelet[2614]: E0213 08:30:57.309535 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.309602 kubelet[2614]: W0213 08:30:57.309568 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.309867 kubelet[2614]: E0213 08:30:57.309611 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.310224 kubelet[2614]: E0213 08:30:57.310158 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.310224 kubelet[2614]: W0213 08:30:57.310191 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.310455 kubelet[2614]: E0213 08:30:57.310230 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.310804 kubelet[2614]: E0213 08:30:57.310741 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.310804 kubelet[2614]: W0213 08:30:57.310770 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.310804 kubelet[2614]: E0213 08:30:57.310803 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.311319 kubelet[2614]: E0213 08:30:57.311284 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.311319 kubelet[2614]: W0213 08:30:57.311318 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.311606 kubelet[2614]: E0213 08:30:57.311357 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.403167 kubelet[2614]: E0213 08:30:57.403058 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.403167 kubelet[2614]: W0213 08:30:57.403100 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.403167 kubelet[2614]: E0213 08:30:57.403146 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.403834 kubelet[2614]: E0213 08:30:57.403743 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.403834 kubelet[2614]: W0213 08:30:57.403778 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.403834 kubelet[2614]: E0213 08:30:57.403830 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.404525 kubelet[2614]: E0213 08:30:57.404435 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.404525 kubelet[2614]: W0213 08:30:57.404469 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.404525 kubelet[2614]: E0213 08:30:57.404517 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.405055 kubelet[2614]: E0213 08:30:57.405036 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.405162 kubelet[2614]: W0213 08:30:57.405061 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.405162 kubelet[2614]: E0213 08:30:57.405105 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.405690 kubelet[2614]: E0213 08:30:57.405600 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.405690 kubelet[2614]: W0213 08:30:57.405634 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.406019 kubelet[2614]: E0213 08:30:57.405770 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.406267 kubelet[2614]: E0213 08:30:57.406194 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.406267 kubelet[2614]: W0213 08:30:57.406228 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.406267 kubelet[2614]: E0213 08:30:57.406276 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.406852 kubelet[2614]: E0213 08:30:57.406799 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.406852 kubelet[2614]: W0213 08:30:57.406836 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.407144 kubelet[2614]: E0213 08:30:57.406887 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.407488 kubelet[2614]: E0213 08:30:57.407414 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.407488 kubelet[2614]: W0213 08:30:57.407449 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.407794 kubelet[2614]: E0213 08:30:57.407581 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.408052 kubelet[2614]: E0213 08:30:57.407970 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.408052 kubelet[2614]: W0213 08:30:57.408002 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.408052 kubelet[2614]: E0213 08:30:57.408044 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.408824 kubelet[2614]: E0213 08:30:57.408746 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.408824 kubelet[2614]: W0213 08:30:57.408784 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.408824 kubelet[2614]: E0213 08:30:57.408832 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:57.409575 kubelet[2614]: E0213 08:30:57.409484 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.409575 kubelet[2614]: W0213 08:30:57.409521 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.409575 kubelet[2614]: E0213 08:30:57.409566 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:30:57.410230 kubelet[2614]: E0213 08:30:57.410138 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:30:57.410230 kubelet[2614]: W0213 08:30:57.410172 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:30:57.410230 kubelet[2614]: E0213 08:30:57.410215 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:30:59.265320 kubelet[2614]: E0213 08:30:59.265259 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:01.264553 kubelet[2614]: E0213 08:31:01.264482 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:03.264810 kubelet[2614]: E0213 08:31:03.264728 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:05.265101 kubelet[2614]: E0213 08:31:05.265085 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:06.280414 kubelet[2614]: E0213 08:31:06.280395 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.280414 kubelet[2614]: W0213 08:31:06.280412 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.280693 kubelet[2614]: E0213 08:31:06.280425 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.280693 kubelet[2614]: E0213 08:31:06.280577 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.280693 kubelet[2614]: W0213 08:31:06.280586 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.280693 kubelet[2614]: E0213 08:31:06.280597 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.280788 kubelet[2614]: E0213 08:31:06.280737 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.280788 kubelet[2614]: W0213 08:31:06.280745 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.280788 kubelet[2614]: E0213 08:31:06.280753 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.280909 kubelet[2614]: E0213 08:31:06.280901 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.280909 kubelet[2614]: W0213 08:31:06.280909 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.280971 kubelet[2614]: E0213 08:31:06.280916 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.281063 kubelet[2614]: E0213 08:31:06.281056 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281093 kubelet[2614]: W0213 08:31:06.281064 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281093 kubelet[2614]: E0213 08:31:06.281072 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.281201 kubelet[2614]: E0213 08:31:06.281195 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281226 kubelet[2614]: W0213 08:31:06.281201 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281226 kubelet[2614]: E0213 08:31:06.281208 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.281338 kubelet[2614]: E0213 08:31:06.281333 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281365 kubelet[2614]: W0213 08:31:06.281338 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281365 kubelet[2614]: E0213 08:31:06.281346 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.281460 kubelet[2614]: E0213 08:31:06.281454 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281485 kubelet[2614]: W0213 08:31:06.281461 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281485 kubelet[2614]: E0213 08:31:06.281469 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.281594 kubelet[2614]: E0213 08:31:06.281588 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281621 kubelet[2614]: W0213 08:31:06.281594 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281621 kubelet[2614]: E0213 08:31:06.281600 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.281742 kubelet[2614]: E0213 08:31:06.281736 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281742 kubelet[2614]: W0213 08:31:06.281741 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281786 kubelet[2614]: E0213 08:31:06.281748 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.281837 kubelet[2614]: E0213 08:31:06.281831 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281864 kubelet[2614]: W0213 08:31:06.281838 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281864 kubelet[2614]: E0213 08:31:06.281844 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.281944 kubelet[2614]: E0213 08:31:06.281938 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.281944 kubelet[2614]: W0213 08:31:06.281944 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.281999 kubelet[2614]: E0213 08:31:06.281951 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.282084 kubelet[2614]: E0213 08:31:06.282078 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282114 kubelet[2614]: W0213 08:31:06.282084 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282114 kubelet[2614]: E0213 08:31:06.282092 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.282213 kubelet[2614]: E0213 08:31:06.282206 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282250 kubelet[2614]: W0213 08:31:06.282213 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282250 kubelet[2614]: E0213 08:31:06.282222 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.282306 kubelet[2614]: E0213 08:31:06.282300 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282306 kubelet[2614]: W0213 08:31:06.282306 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282359 kubelet[2614]: E0213 08:31:06.282313 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.282400 kubelet[2614]: E0213 08:31:06.282394 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282400 kubelet[2614]: W0213 08:31:06.282399 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282446 kubelet[2614]: E0213 08:31:06.282406 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.282487 kubelet[2614]: E0213 08:31:06.282481 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282516 kubelet[2614]: W0213 08:31:06.282487 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282516 kubelet[2614]: E0213 08:31:06.282493 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.282575 kubelet[2614]: E0213 08:31:06.282569 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282575 kubelet[2614]: W0213 08:31:06.282575 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282622 kubelet[2614]: E0213 08:31:06.282581 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:06.282664 kubelet[2614]: E0213 08:31:06.282658 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282692 kubelet[2614]: W0213 08:31:06.282665 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282692 kubelet[2614]: E0213 08:31:06.282674 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:06.282759 kubelet[2614]: E0213 08:31:06.282753 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:06.282787 kubelet[2614]: W0213 08:31:06.282759 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:06.282787 kubelet[2614]: E0213 08:31:06.282766 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:07.265067 kubelet[2614]: E0213 08:31:07.265021 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:09.264763 kubelet[2614]: E0213 08:31:09.264743 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:11.264972 kubelet[2614]: E0213 08:31:11.264882 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:12.330835 kubelet[2614]: E0213 08:31:12.330726 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:12.330835 kubelet[2614]: W0213 08:31:12.330771 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:12.330835 kubelet[2614]: E0213 08:31:12.330819 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:12.331871 kubelet[2614]: E0213 08:31:12.331395 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:12.331871 kubelet[2614]: W0213 08:31:12.331427 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:12.331871 kubelet[2614]: E0213 08:31:12.331464 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:12.332231 kubelet[2614]: E0213 08:31:12.332030 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:12.332231 kubelet[2614]: W0213 08:31:12.332062 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:12.332231 kubelet[2614]: E0213 08:31:12.332100 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:12.332727 kubelet[2614]: E0213 08:31:12.332636 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:12.332727 kubelet[2614]: W0213 08:31:12.332670 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:12.332727 kubelet[2614]: E0213 08:31:12.332708 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.265357 kubelet[2614]: E0213 08:31:13.265339 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:13.341348 kubelet[2614]: E0213 08:31:13.341280 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.341348 kubelet[2614]: W0213 08:31:13.341330 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.342539 kubelet[2614]: E0213 08:31:13.341401 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.342539 kubelet[2614]: E0213 08:31:13.341988 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.342539 kubelet[2614]: W0213 08:31:13.342026 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.342539 kubelet[2614]: E0213 08:31:13.342083 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.343239 kubelet[2614]: E0213 08:31:13.342600 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.343239 kubelet[2614]: W0213 08:31:13.342635 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.343239 kubelet[2614]: E0213 08:31:13.342691 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.343726 kubelet[2614]: E0213 08:31:13.343373 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.343726 kubelet[2614]: W0213 08:31:13.343408 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.343726 kubelet[2614]: E0213 08:31:13.343461 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.344210 kubelet[2614]: E0213 08:31:13.344003 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.344210 kubelet[2614]: W0213 08:31:13.344037 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.344210 kubelet[2614]: E0213 08:31:13.344089 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.344644 kubelet[2614]: E0213 08:31:13.344616 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.344836 kubelet[2614]: W0213 08:31:13.344653 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.344836 kubelet[2614]: E0213 08:31:13.344707 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.345243 kubelet[2614]: E0213 08:31:13.345203 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.345243 kubelet[2614]: W0213 08:31:13.345234 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.345580 kubelet[2614]: E0213 08:31:13.345288 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.345813 kubelet[2614]: E0213 08:31:13.345777 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.345813 kubelet[2614]: W0213 08:31:13.345806 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.346176 kubelet[2614]: E0213 08:31:13.345852 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.414126 kubelet[2614]: E0213 08:31:13.414066 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.414126 kubelet[2614]: W0213 08:31:13.414111 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.414632 kubelet[2614]: E0213 08:31:13.414174 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.414916 kubelet[2614]: E0213 08:31:13.414881 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.415134 kubelet[2614]: W0213 08:31:13.414915 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.415134 kubelet[2614]: E0213 08:31:13.414988 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.415557 kubelet[2614]: E0213 08:31:13.415515 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.415557 kubelet[2614]: W0213 08:31:13.415547 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.415951 kubelet[2614]: E0213 08:31:13.415599 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.416189 kubelet[2614]: E0213 08:31:13.416151 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.416189 kubelet[2614]: W0213 08:31:13.416182 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.416577 kubelet[2614]: E0213 08:31:13.416239 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:13.416858 kubelet[2614]: E0213 08:31:13.416818 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.416858 kubelet[2614]: W0213 08:31:13.416850 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.417305 kubelet[2614]: E0213 08:31:13.416899 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:13.417980 kubelet[2614]: E0213 08:31:13.417920 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:13.417980 kubelet[2614]: W0213 08:31:13.417977 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:13.418317 kubelet[2614]: E0213 08:31:13.418030 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:15.265255 kubelet[2614]: E0213 08:31:15.265209 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:17.264674 kubelet[2614]: E0213 08:31:17.264657 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:19.264921 kubelet[2614]: E0213 08:31:19.264876 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:21.265467 kubelet[2614]: E0213 08:31:21.265367 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:23.265281 kubelet[2614]: E0213 08:31:23.265178 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:23.318137 kubelet[2614]: E0213 08:31:23.318046 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.318137 kubelet[2614]: W0213 08:31:23.318087 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.318137 kubelet[2614]: E0213 08:31:23.318131 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:23.318756 kubelet[2614]: E0213 08:31:23.318676 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.318756 kubelet[2614]: W0213 08:31:23.318712 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.318756 kubelet[2614]: E0213 08:31:23.318751 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:23.319351 kubelet[2614]: E0213 08:31:23.319270 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.319351 kubelet[2614]: W0213 08:31:23.319305 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.319351 kubelet[2614]: E0213 08:31:23.319345 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:23.319988 kubelet[2614]: E0213 08:31:23.319953 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.320135 kubelet[2614]: W0213 08:31:23.319990 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.320135 kubelet[2614]: E0213 08:31:23.320031 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:23.320589 kubelet[2614]: E0213 08:31:23.320555 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.320704 kubelet[2614]: W0213 08:31:23.320592 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.320704 kubelet[2614]: E0213 08:31:23.320632 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:23.321174 kubelet[2614]: E0213 08:31:23.321140 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.321305 kubelet[2614]: W0213 08:31:23.321175 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.321305 kubelet[2614]: E0213 08:31:23.321219 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:23.321849 kubelet[2614]: E0213 08:31:23.321790 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.321849 kubelet[2614]: W0213 08:31:23.321830 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.322173 kubelet[2614]: E0213 08:31:23.321871 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:23.322523 kubelet[2614]: E0213 08:31:23.322432 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.322523 kubelet[2614]: W0213 08:31:23.322468 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.322523 kubelet[2614]: E0213 08:31:23.322508 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:23.323004 kubelet[2614]: E0213 08:31:23.322958 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.323004 kubelet[2614]: W0213 08:31:23.322986 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.323268 kubelet[2614]: E0213 08:31:23.323019 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:23.323540 kubelet[2614]: E0213 08:31:23.323469 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.323540 kubelet[2614]: W0213 08:31:23.323498 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.323540 kubelet[2614]: E0213 08:31:23.323534 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:23.324071 kubelet[2614]: E0213 08:31:23.324002 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.324071 kubelet[2614]: W0213 08:31:23.324030 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.324071 kubelet[2614]: E0213 08:31:23.324061 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:23.324599 kubelet[2614]: E0213 08:31:23.324529 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:23.324599 kubelet[2614]: W0213 08:31:23.324563 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:23.324599 kubelet[2614]: E0213 08:31:23.324602 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:25.264474 kubelet[2614]: E0213 08:31:25.264428 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:27.264986 kubelet[2614]: E0213 08:31:27.264967 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:29.264253 kubelet[2614]: E0213 08:31:29.264208 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:31.265519 kubelet[2614]: E0213 08:31:31.265448 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:33.265193 kubelet[2614]: E0213 08:31:33.265132 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:35.265370 kubelet[2614]: E0213 08:31:35.265267 2614 
pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:35.604000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.632418 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:31:35.632463 kernel: audit: type=1400 audit(1707813095.604:1143): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.604000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001b9e960 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:35.844457 kernel: audit: type=1300 audit(1707813095.604:1143): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001b9e960 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:35.844489 kernel: audit: type=1327 audit(1707813095.604:1143): 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:35.604000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:35.938201 kernel: audit: type=1400 audit(1707813095.604:1144): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.604000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.604000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00272e3c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:36.149913 kernel: audit: type=1300 audit(1707813095.604:1144): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00272e3c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:36.149948 kernel: audit: type=1327 
audit(1707813095.604:1144): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:35.604000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:36.243301 kernel: audit: type=1400 audit(1707813095.886:1145): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00a197ad0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:36.432953 kernel: audit: type=1300 audit(1707813095.886:1145): arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00a197ad0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:36.432983 
kernel: audit: type=1327 audit(1707813095.886:1145): proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:36.616442 kernel: audit: type=1400 audit(1707813095.886:1146): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0034a8d80 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" 
path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0114ab080 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:35.888000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0038495c0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:35.888000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0114ab110 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:35.888000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00a197c50 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:31:35.888000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:31:37.264327 kubelet[2614]: E0213 08:31:37.264310 2614 pod_workers.go:965] "Error syncing pod, 
skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:39.265511 kubelet[2614]: E0213 08:31:39.265417 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:40.308124 kubelet[2614]: E0213 08:31:40.308067 2614 kubelet_node_status.go:452] "Node not becoming ready in time after startup" Feb 13 08:31:40.941000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:40.969976 kernel: kauditd_printk_skb: 14 callbacks suppressed Feb 13 08:31:40.970014 kernel: audit: type=1400 audit(1707813100.941:1151): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:40.941000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e38340 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:41.183351 kernel: audit: type=1300 audit(1707813100.941:1151): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e38340 a2=fc6 a3=0 items=0 ppid=2294 
pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:41.183384 kernel: audit: type=1327 audit(1707813100.941:1151): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:40.941000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:41.264761 kubelet[2614]: E0213 08:31:41.264727 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:41.278019 kernel: audit: type=1400 audit(1707813100.943:1152): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:40.943000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:41.367551 kernel: audit: type=1300 audit(1707813100.943:1152): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e38360 a2=fc6 a3=0 
items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:40.943000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e38360 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:41.488287 kernel: audit: type=1327 audit(1707813100.943:1152): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:40.943000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:40.945000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:41.672101 kernel: audit: type=1400 audit(1707813100.945:1153): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:41.672145 kernel: audit: type=1300 audit(1707813100.945:1153): arch=c000003e syscall=254 success=no exit=-13 a0=b 
a1=c00272e600 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:40.945000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00272e600 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:40.945000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:41.886569 kernel: audit: type=1327 audit(1707813100.945:1153): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:41.886599 kernel: audit: type=1400 audit(1707813100.948:1154): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:40.948000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:31:40.948000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0028ff540 
a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:31:40.948000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:31:43.264684 kubelet[2614]: E0213 08:31:43.264639 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:45.265350 kubelet[2614]: E0213 08:31:45.265250 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:45.302806 kubelet[2614]: E0213 08:31:45.302707 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:31:47.264689 kubelet[2614]: E0213 08:31:47.264643 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:49.265173 kubelet[2614]: 
E0213 08:31:49.265129 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:50.303337 kubelet[2614]: E0213 08:31:50.303278 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:31:51.265155 kubelet[2614]: E0213 08:31:51.265100 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:53.265185 kubelet[2614]: E0213 08:31:53.265133 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:55.265312 kubelet[2614]: E0213 08:31:55.265265 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:55.303873 kubelet[2614]: E0213 08:31:55.303818 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:31:57.264773 
kubelet[2614]: E0213 08:31:57.264756 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:31:58.291869 kubelet[2614]: E0213 08:31:58.291845 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.291869 kubelet[2614]: W0213 08:31:58.291862 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.292219 kubelet[2614]: E0213 08:31:58.291882 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.292219 kubelet[2614]: E0213 08:31:58.292103 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.292219 kubelet[2614]: W0213 08:31:58.292114 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.292219 kubelet[2614]: E0213 08:31:58.292129 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.292361 kubelet[2614]: E0213 08:31:58.292335 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.292361 kubelet[2614]: W0213 08:31:58.292346 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.292361 kubelet[2614]: E0213 08:31:58.292361 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.292611 kubelet[2614]: E0213 08:31:58.292573 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.292611 kubelet[2614]: W0213 08:31:58.292584 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.292611 kubelet[2614]: E0213 08:31:58.292597 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.292795 kubelet[2614]: E0213 08:31:58.292759 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.292795 kubelet[2614]: W0213 08:31:58.292767 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.292795 kubelet[2614]: E0213 08:31:58.292777 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.292904 kubelet[2614]: E0213 08:31:58.292896 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.292904 kubelet[2614]: W0213 08:31:58.292903 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.292980 kubelet[2614]: E0213 08:31:58.292912 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.293116 kubelet[2614]: E0213 08:31:58.293079 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293116 kubelet[2614]: W0213 08:31:58.293086 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293116 kubelet[2614]: E0213 08:31:58.293096 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.293237 kubelet[2614]: E0213 08:31:58.293196 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293237 kubelet[2614]: W0213 08:31:58.293202 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293237 kubelet[2614]: E0213 08:31:58.293211 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.293322 kubelet[2614]: E0213 08:31:58.293305 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293322 kubelet[2614]: W0213 08:31:58.293311 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293322 kubelet[2614]: E0213 08:31:58.293320 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.293487 kubelet[2614]: E0213 08:31:58.293450 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293487 kubelet[2614]: W0213 08:31:58.293458 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293487 kubelet[2614]: E0213 08:31:58.293467 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.293608 kubelet[2614]: E0213 08:31:58.293557 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293608 kubelet[2614]: W0213 08:31:58.293563 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293608 kubelet[2614]: E0213 08:31:58.293572 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.293698 kubelet[2614]: E0213 08:31:58.293661 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293698 kubelet[2614]: W0213 08:31:58.293667 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293698 kubelet[2614]: E0213 08:31:58.293675 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.293783 kubelet[2614]: E0213 08:31:58.293775 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293783 kubelet[2614]: W0213 08:31:58.293781 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293844 kubelet[2614]: E0213 08:31:58.293790 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.293889 kubelet[2614]: E0213 08:31:58.293881 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.293889 kubelet[2614]: W0213 08:31:58.293888 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.293977 kubelet[2614]: E0213 08:31:58.293896 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.294011 kubelet[2614]: E0213 08:31:58.293997 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.294011 kubelet[2614]: W0213 08:31:58.294003 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.294073 kubelet[2614]: E0213 08:31:58.294012 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.294116 kubelet[2614]: E0213 08:31:58.294104 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.294116 kubelet[2614]: W0213 08:31:58.294111 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.294182 kubelet[2614]: E0213 08:31:58.294120 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.344279 kubelet[2614]: E0213 08:31:58.344169 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.344279 kubelet[2614]: W0213 08:31:58.344211 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.344279 kubelet[2614]: E0213 08:31:58.344256 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.344903 kubelet[2614]: E0213 08:31:58.344868 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.345059 kubelet[2614]: W0213 08:31:58.344903 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.345059 kubelet[2614]: E0213 08:31:58.344969 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.345704 kubelet[2614]: E0213 08:31:58.345620 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.345704 kubelet[2614]: W0213 08:31:58.345657 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.345704 kubelet[2614]: E0213 08:31:58.345706 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.346367 kubelet[2614]: E0213 08:31:58.346291 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.346367 kubelet[2614]: W0213 08:31:58.346327 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.346367 kubelet[2614]: E0213 08:31:58.346374 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.347043 kubelet[2614]: E0213 08:31:58.346968 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.347043 kubelet[2614]: W0213 08:31:58.347004 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.347381 kubelet[2614]: E0213 08:31:58.347104 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.347691 kubelet[2614]: E0213 08:31:58.347601 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.347691 kubelet[2614]: W0213 08:31:58.347636 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.347691 kubelet[2614]: E0213 08:31:58.347692 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.348333 kubelet[2614]: E0213 08:31:58.348240 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.348333 kubelet[2614]: W0213 08:31:58.348274 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.348333 kubelet[2614]: E0213 08:31:58.348320 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.348941 kubelet[2614]: E0213 08:31:58.348890 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.349075 kubelet[2614]: W0213 08:31:58.348943 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.349075 kubelet[2614]: E0213 08:31:58.349050 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.349702 kubelet[2614]: E0213 08:31:58.349613 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.349702 kubelet[2614]: W0213 08:31:58.349647 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.349702 kubelet[2614]: E0213 08:31:58.349706 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.350451 kubelet[2614]: E0213 08:31:58.350360 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.350451 kubelet[2614]: W0213 08:31:58.350388 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.350451 kubelet[2614]: E0213 08:31:58.350442 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:58.351120 kubelet[2614]: E0213 08:31:58.351046 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.351120 kubelet[2614]: W0213 08:31:58.351073 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.351120 kubelet[2614]: E0213 08:31:58.351128 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:31:58.351733 kubelet[2614]: E0213 08:31:58.351661 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:31:58.351733 kubelet[2614]: W0213 08:31:58.351697 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:31:58.351733 kubelet[2614]: E0213 08:31:58.351736 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:31:59.264500 kubelet[2614]: E0213 08:31:59.264452 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:00.304521 kubelet[2614]: E0213 08:32:00.304506 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:01.265262 kubelet[2614]: E0213 08:32:01.265203 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:03.264781 kubelet[2614]: E0213 08:32:03.264733 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:05.264764 kubelet[2614]: E0213 08:32:05.264665 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:05.305724 kubelet[2614]: E0213 08:32:05.305667 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: cni plugin not initialized" Feb 13 08:32:07.264531 kubelet[2614]: E0213 08:32:07.264460 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:09.264221 kubelet[2614]: E0213 08:32:09.264202 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:10.307186 kubelet[2614]: E0213 08:32:10.307121 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:11.264864 kubelet[2614]: E0213 08:32:11.264834 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:13.264601 kubelet[2614]: E0213 08:32:13.264544 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:15.265188 kubelet[2614]: E0213 08:32:15.265141 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:15.308738 kubelet[2614]: E0213 08:32:15.308677 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:15.329205 kubelet[2614]: E0213 08:32:15.329097 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.329205 kubelet[2614]: W0213 08:32:15.329139 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.329205 kubelet[2614]: E0213 08:32:15.329192 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.329794 kubelet[2614]: E0213 08:32:15.329732 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.329794 kubelet[2614]: W0213 08:32:15.329768 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.330088 kubelet[2614]: E0213 08:32:15.329807 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.330467 kubelet[2614]: E0213 08:32:15.330376 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.330467 kubelet[2614]: W0213 08:32:15.330412 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.330467 kubelet[2614]: E0213 08:32:15.330452 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.331134 kubelet[2614]: E0213 08:32:15.331043 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.331134 kubelet[2614]: W0213 08:32:15.331071 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.331134 kubelet[2614]: E0213 08:32:15.331110 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.331691 kubelet[2614]: E0213 08:32:15.331610 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.331691 kubelet[2614]: W0213 08:32:15.331646 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.331691 kubelet[2614]: E0213 08:32:15.331688 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.332319 kubelet[2614]: E0213 08:32:15.332237 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.332319 kubelet[2614]: W0213 08:32:15.332273 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.332319 kubelet[2614]: E0213 08:32:15.332315 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.332950 kubelet[2614]: E0213 08:32:15.332908 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.333091 kubelet[2614]: W0213 08:32:15.332955 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.333091 kubelet[2614]: E0213 08:32:15.332995 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.333584 kubelet[2614]: E0213 08:32:15.333495 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.333584 kubelet[2614]: W0213 08:32:15.333530 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.333584 kubelet[2614]: E0213 08:32:15.333570 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.334226 kubelet[2614]: E0213 08:32:15.334136 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.334226 kubelet[2614]: W0213 08:32:15.334172 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.334226 kubelet[2614]: E0213 08:32:15.334212 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.334856 kubelet[2614]: E0213 08:32:15.334795 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.334856 kubelet[2614]: W0213 08:32:15.334830 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.335176 kubelet[2614]: E0213 08:32:15.334872 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.335529 kubelet[2614]: E0213 08:32:15.335445 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.335529 kubelet[2614]: W0213 08:32:15.335490 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.335839 kubelet[2614]: E0213 08:32:15.335543 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.336108 kubelet[2614]: E0213 08:32:15.336027 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.336108 kubelet[2614]: W0213 08:32:15.336053 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.336108 kubelet[2614]: E0213 08:32:15.336086 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.336702 kubelet[2614]: E0213 08:32:15.336648 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.336702 kubelet[2614]: W0213 08:32:15.336683 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.336966 kubelet[2614]: E0213 08:32:15.336721 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.337322 kubelet[2614]: E0213 08:32:15.337247 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.337322 kubelet[2614]: W0213 08:32:15.337283 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.337322 kubelet[2614]: E0213 08:32:15.337322 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.337834 kubelet[2614]: E0213 08:32:15.337804 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.337834 kubelet[2614]: W0213 08:32:15.337832 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.338097 kubelet[2614]: E0213 08:32:15.337867 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.338503 kubelet[2614]: E0213 08:32:15.338430 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.338503 kubelet[2614]: W0213 08:32:15.338465 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.338503 kubelet[2614]: E0213 08:32:15.338503 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.339010 kubelet[2614]: E0213 08:32:15.338979 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.339010 kubelet[2614]: W0213 08:32:15.339008 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.339247 kubelet[2614]: E0213 08:32:15.339043 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.339589 kubelet[2614]: E0213 08:32:15.339524 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.339589 kubelet[2614]: W0213 08:32:15.339551 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.339589 kubelet[2614]: E0213 08:32:15.339591 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:15.340126 kubelet[2614]: E0213 08:32:15.340045 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.340126 kubelet[2614]: W0213 08:32:15.340081 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.340126 kubelet[2614]: E0213 08:32:15.340124 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:15.340633 kubelet[2614]: E0213 08:32:15.340597 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:15.340757 kubelet[2614]: W0213 08:32:15.340632 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:15.340757 kubelet[2614]: E0213 08:32:15.340677 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:17.264459 kubelet[2614]: E0213 08:32:17.264426 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:19.265123 kubelet[2614]: E0213 08:32:19.265078 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:20.310218 kubelet[2614]: E0213 08:32:20.310167 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:21.265159 kubelet[2614]: E0213 08:32:21.265122 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:23.265409 kubelet[2614]: E0213 08:32:23.265340 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:25.264797 kubelet[2614]: E0213 08:32:25.264753 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:25.311534 kubelet[2614]: E0213 08:32:25.311461 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:27.265153 kubelet[2614]: E0213 08:32:27.265097 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:29.264854 kubelet[2614]: E0213 08:32:29.264808 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:29.358788 kubelet[2614]: E0213 08:32:29.358693 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.358788 kubelet[2614]: W0213 08:32:29.358738 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.358788 kubelet[2614]: E0213 08:32:29.358790 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:29.359323 kubelet[2614]: E0213 08:32:29.359260 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.359323 kubelet[2614]: W0213 08:32:29.359285 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.359323 kubelet[2614]: E0213 08:32:29.359320 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:29.359836 kubelet[2614]: E0213 08:32:29.359769 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.359836 kubelet[2614]: W0213 08:32:29.359794 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.359836 kubelet[2614]: E0213 08:32:29.359827 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:29.360474 kubelet[2614]: E0213 08:32:29.360401 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.360474 kubelet[2614]: W0213 08:32:29.360435 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.360474 kubelet[2614]: E0213 08:32:29.360473 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:29.361050 kubelet[2614]: E0213 08:32:29.360982 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.361050 kubelet[2614]: W0213 08:32:29.361008 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.361050 kubelet[2614]: E0213 08:32:29.361045 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:29.361601 kubelet[2614]: E0213 08:32:29.361528 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.361601 kubelet[2614]: W0213 08:32:29.361562 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.361601 kubelet[2614]: E0213 08:32:29.361601 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:29.362234 kubelet[2614]: E0213 08:32:29.362198 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.362234 kubelet[2614]: W0213 08:32:29.362233 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.362462 kubelet[2614]: E0213 08:32:29.362272 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:29.362800 kubelet[2614]: E0213 08:32:29.362754 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.362800 kubelet[2614]: W0213 08:32:29.362780 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.363181 kubelet[2614]: E0213 08:32:29.362815 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:29.363355 kubelet[2614]: E0213 08:32:29.363317 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.363355 kubelet[2614]: W0213 08:32:29.363350 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.363583 kubelet[2614]: E0213 08:32:29.363389 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:29.363889 kubelet[2614]: E0213 08:32:29.363864 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.364049 kubelet[2614]: W0213 08:32:29.363889 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.364049 kubelet[2614]: E0213 08:32:29.363921 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:29.364502 kubelet[2614]: E0213 08:32:29.364423 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.364502 kubelet[2614]: W0213 08:32:29.364456 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.364502 kubelet[2614]: E0213 08:32:29.364494 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:29.365043 kubelet[2614]: E0213 08:32:29.364974 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:29.365043 kubelet[2614]: W0213 08:32:29.365000 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:29.365043 kubelet[2614]: E0213 08:32:29.365032 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:30.312813 kubelet[2614]: E0213 08:32:30.312778 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:31.264448 kubelet[2614]: E0213 08:32:31.264402 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:32.346462 systemd[1]: Started sshd@9-145.40.67.89:22-161.35.108.241:57868.service. Feb 13 08:32:32.345000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.89:22-161.35.108.241:57868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:32:32.461913 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:32:32.461992 kernel: audit: type=1130 audit(1707813152.345:1155): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.89:22-161.35.108.241:57868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:32:32.798614 sshd[3483]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:32:32.797000 audit[3483]: USER_AUTH pid=3483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:32:32.888120 kernel: audit: type=1100 audit(1707813152.797:1156): pid=3483 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:32:33.264709 kubelet[2614]: E0213 08:32:33.264658 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:34.814998 sshd[3483]: Failed password for root from 161.35.108.241 port 57868 ssh2 Feb 13 08:32:35.265366 kubelet[2614]: E0213 08:32:35.265285 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:35.313924 kubelet[2614]: E0213 
08:32:35.313856 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:35.605000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.605000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024ed5f0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:35.724814 sshd[3483]: Received disconnect from 161.35.108.241 port 57868:11: Bye Bye [preauth] Feb 13 08:32:35.724814 sshd[3483]: Disconnected from authenticating user root 161.35.108.241 port 57868 [preauth] Feb 13 08:32:35.725430 systemd[1]: sshd@9-145.40.67.89:22-161.35.108.241:57868.service: Deactivated successfully. 
Feb 13 08:32:35.821012 kernel: audit: type=1400 audit(1707813155.605:1157): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.821046 kernel: audit: type=1300 audit(1707813155.605:1157): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024ed5f0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:35.821061 kernel: audit: type=1327 audit(1707813155.605:1157): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:35.605000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:35.914704 kernel: audit: type=1400 audit(1707813155.605:1158): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.605000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:36.005045 kernel: audit: 
type=1300 audit(1707813155.605:1158): arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002d25a60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:35.605000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002d25a60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:36.125683 kernel: audit: type=1327 audit(1707813155.605:1158): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:35.605000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:36.219234 kernel: audit: type=1131 audit(1707813155.724:1159): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.89:22-161.35.108.241:57868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:32:35.724000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-145.40.67.89:22-161.35.108.241:57868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:32:36.308358 kernel: audit: type=1400 audit(1707813155.887:1160): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011d8ff00 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:32:35.887000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:32:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c005681c20 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:32:35.887000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:32:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011d934d0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:32:35.888000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:32:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0142359a0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:32:35.888000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:32:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c011d93530 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:32:35.888000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:32:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0115d47b0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:32:35.888000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:32:36.934876 systemd[1]: Started sshd@10-145.40.67.89:22-43.153.15.221:60672.service. Feb 13 08:32:36.934000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.67.89:22-43.153.15.221:60672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:32:37.063694 sshd[3487]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:32:37.063000 audit[3487]: USER_AUTH pid=3487 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:32:37.264385 kubelet[2614]: E0213 08:32:37.264309 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:37.317661 kubelet[2614]: E0213 08:32:37.317568 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.317661 kubelet[2614]: W0213 08:32:37.317608 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.317661 kubelet[2614]: E0213 08:32:37.317650 2614 plugins.go:736] "Error dynamically probing plugins" err="error 
creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.318149 kubelet[2614]: E0213 08:32:37.318124 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.318258 kubelet[2614]: W0213 08:32:37.318153 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.318258 kubelet[2614]: E0213 08:32:37.318189 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.318732 kubelet[2614]: E0213 08:32:37.318658 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.318732 kubelet[2614]: W0213 08:32:37.318689 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.318732 kubelet[2614]: E0213 08:32:37.318724 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.319283 kubelet[2614]: E0213 08:32:37.319215 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.319283 kubelet[2614]: W0213 08:32:37.319246 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.319283 kubelet[2614]: E0213 08:32:37.319282 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.319814 kubelet[2614]: E0213 08:32:37.319752 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.319814 kubelet[2614]: W0213 08:32:37.319782 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.319814 kubelet[2614]: E0213 08:32:37.319817 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.320403 kubelet[2614]: E0213 08:32:37.320318 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.320403 kubelet[2614]: W0213 08:32:37.320348 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.320403 kubelet[2614]: E0213 08:32:37.320383 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.320925 kubelet[2614]: E0213 08:32:37.320898 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.320925 kubelet[2614]: W0213 08:32:37.320923 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.321163 kubelet[2614]: E0213 08:32:37.320974 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.321609 kubelet[2614]: E0213 08:32:37.321525 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.321609 kubelet[2614]: W0213 08:32:37.321555 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.321609 kubelet[2614]: E0213 08:32:37.321590 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.322137 kubelet[2614]: E0213 08:32:37.322051 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.322137 kubelet[2614]: W0213 08:32:37.322076 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.322137 kubelet[2614]: E0213 08:32:37.322108 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.322634 kubelet[2614]: E0213 08:32:37.322549 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.322634 kubelet[2614]: W0213 08:32:37.322579 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.322634 kubelet[2614]: E0213 08:32:37.322615 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.323195 kubelet[2614]: E0213 08:32:37.323125 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.323195 kubelet[2614]: W0213 08:32:37.323155 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.323195 kubelet[2614]: E0213 08:32:37.323190 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.323746 kubelet[2614]: E0213 08:32:37.323681 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.323746 kubelet[2614]: W0213 08:32:37.323710 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.323746 kubelet[2614]: E0213 08:32:37.323745 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.417869 kubelet[2614]: E0213 08:32:37.417815 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.417869 kubelet[2614]: W0213 08:32:37.417858 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.418300 kubelet[2614]: E0213 08:32:37.417907 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.418623 kubelet[2614]: E0213 08:32:37.418589 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.418750 kubelet[2614]: W0213 08:32:37.418625 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.418750 kubelet[2614]: E0213 08:32:37.418672 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.419294 kubelet[2614]: E0213 08:32:37.419202 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.419294 kubelet[2614]: W0213 08:32:37.419237 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.419294 kubelet[2614]: E0213 08:32:37.419276 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.419871 kubelet[2614]: E0213 08:32:37.419803 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.419871 kubelet[2614]: W0213 08:32:37.419836 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.419871 kubelet[2614]: E0213 08:32:37.419875 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:37.420553 kubelet[2614]: E0213 08:32:37.420462 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.420553 kubelet[2614]: W0213 08:32:37.420496 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.420553 kubelet[2614]: E0213 08:32:37.420536 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:32:37.421544 kubelet[2614]: E0213 08:32:37.421468 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:32:37.421544 kubelet[2614]: W0213 08:32:37.421503 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:32:37.421544 kubelet[2614]: E0213 08:32:37.421543 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:32:39.100549 sshd[3487]: Failed password for root from 43.153.15.221 port 60672 ssh2 Feb 13 08:32:39.264752 kubelet[2614]: E0213 08:32:39.264706 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:39.921720 sshd[3487]: Received disconnect from 43.153.15.221 port 60672:11: Bye Bye [preauth] Feb 13 08:32:39.921720 sshd[3487]: Disconnected from authenticating user root 43.153.15.221 port 60672 [preauth] Feb 13 08:32:39.924264 systemd[1]: sshd@10-145.40.67.89:22-43.153.15.221:60672.service: Deactivated successfully. Feb 13 08:32:39.923000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.67.89:22-43.153.15.221:60672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:32:39.952351 kernel: kauditd_printk_skb: 19 callbacks suppressed Feb 13 08:32:39.952421 kernel: audit: type=1131 audit(1707813159.923:1168): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-145.40.67.89:22-43.153.15.221:60672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:32:40.315562 kubelet[2614]: E0213 08:32:40.315391 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:40.942000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:40.942000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000556820 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:41.155838 kernel: audit: type=1400 audit(1707813160.942:1169): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:41.155911 kernel: audit: type=1300 audit(1707813160.942:1169): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000556820 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 
13 08:32:41.155928 kernel: audit: type=1327 audit(1707813160.942:1169): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:40.942000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:41.249385 kernel: audit: type=1400 audit(1707813160.944:1170): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:40.944000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:41.265115 kubelet[2614]: E0213 08:32:41.265104 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:41.340635 kernel: audit: type=1300 audit(1707813160.944:1170): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000556880 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:40.944000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000556880 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:41.461805 kernel: audit: type=1327 audit(1707813160.944:1170): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:40.944000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:40.946000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:41.646398 kernel: audit: type=1400 audit(1707813160.946:1171): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:41.646428 kernel: audit: type=1300 audit(1707813160.946:1171): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002152040 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:40.946000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002152040 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:41.768013 kernel: audit: type=1327 audit(1707813160.946:1171): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:40.946000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:40.949000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:32:40.949000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000556b00 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:32:40.949000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:32:43.264450 kubelet[2614]: E0213 08:32:43.264433 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:45.265076 kubelet[2614]: E0213 08:32:45.265058 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:45.316746 kubelet[2614]: E0213 08:32:45.316698 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:47.264248 kubelet[2614]: E0213 08:32:47.264231 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:49.265019 kubelet[2614]: E0213 08:32:49.264946 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" 
podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:50.317601 kubelet[2614]: E0213 08:32:50.317564 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:51.264945 kubelet[2614]: E0213 08:32:51.264919 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:53.265125 kubelet[2614]: E0213 08:32:53.265076 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:55.264186 kubelet[2614]: E0213 08:32:55.264137 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:55.319478 kubelet[2614]: E0213 08:32:55.319373 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:32:57.265114 kubelet[2614]: E0213 08:32:57.265094 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:32:59.265022 kubelet[2614]: E0213 08:32:59.265005 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:00.320770 kubelet[2614]: E0213 08:33:00.320663 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:01.264856 kubelet[2614]: E0213 08:33:01.264787 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:03.265174 kubelet[2614]: E0213 08:33:03.265069 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:04.330405 kubelet[2614]: E0213 08:33:04.330306 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.330405 kubelet[2614]: W0213 08:33:04.330357 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.330405 kubelet[2614]: E0213 08:33:04.330405 2614 plugins.go:736] 
"Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.331461 kubelet[2614]: E0213 08:33:04.330893 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.331461 kubelet[2614]: W0213 08:33:04.330950 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.331461 kubelet[2614]: E0213 08:33:04.330994 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.331801 kubelet[2614]: E0213 08:33:04.331543 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.331801 kubelet[2614]: W0213 08:33:04.331577 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.331801 kubelet[2614]: E0213 08:33:04.331620 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.332326 kubelet[2614]: E0213 08:33:04.332245 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.332326 kubelet[2614]: W0213 08:33:04.332280 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.332326 kubelet[2614]: E0213 08:33:04.332319 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.332872 kubelet[2614]: E0213 08:33:04.332833 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.332872 kubelet[2614]: W0213 08:33:04.332869 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.333149 kubelet[2614]: E0213 08:33:04.332908 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.333519 kubelet[2614]: E0213 08:33:04.333439 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.333519 kubelet[2614]: W0213 08:33:04.333474 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.333519 kubelet[2614]: E0213 08:33:04.333514 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.334073 kubelet[2614]: E0213 08:33:04.334004 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.334073 kubelet[2614]: W0213 08:33:04.334030 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.334073 kubelet[2614]: E0213 08:33:04.334066 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.334641 kubelet[2614]: E0213 08:33:04.334562 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.334641 kubelet[2614]: W0213 08:33:04.334596 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.334641 kubelet[2614]: E0213 08:33:04.334637 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.335210 kubelet[2614]: E0213 08:33:04.335118 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.335210 kubelet[2614]: W0213 08:33:04.335154 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.335210 kubelet[2614]: E0213 08:33:04.335193 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.335821 kubelet[2614]: E0213 08:33:04.335792 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.335821 kubelet[2614]: W0213 08:33:04.335820 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.336093 kubelet[2614]: E0213 08:33:04.335855 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.336483 kubelet[2614]: E0213 08:33:04.336392 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.336483 kubelet[2614]: W0213 08:33:04.336426 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.336483 kubelet[2614]: E0213 08:33:04.336470 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.336976 kubelet[2614]: E0213 08:33:04.336947 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.337138 kubelet[2614]: W0213 08:33:04.336977 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.337138 kubelet[2614]: E0213 08:33:04.337013 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.337607 kubelet[2614]: E0213 08:33:04.337532 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.337607 kubelet[2614]: W0213 08:33:04.337566 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.337607 kubelet[2614]: E0213 08:33:04.337606 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.338148 kubelet[2614]: E0213 08:33:04.338060 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.338148 kubelet[2614]: W0213 08:33:04.338090 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.338148 kubelet[2614]: E0213 08:33:04.338126 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.338695 kubelet[2614]: E0213 08:33:04.338636 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.338695 kubelet[2614]: W0213 08:33:04.338671 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.338939 kubelet[2614]: E0213 08:33:04.338713 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.339268 kubelet[2614]: E0213 08:33:04.339194 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.339268 kubelet[2614]: W0213 08:33:04.339228 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.339268 kubelet[2614]: E0213 08:33:04.339267 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.426349 kubelet[2614]: E0213 08:33:04.426325 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.426349 kubelet[2614]: W0213 08:33:04.426343 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.426505 kubelet[2614]: E0213 08:33:04.426362 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.426593 kubelet[2614]: E0213 08:33:04.426581 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.426593 kubelet[2614]: W0213 08:33:04.426592 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.426679 kubelet[2614]: E0213 08:33:04.426608 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.426855 kubelet[2614]: E0213 08:33:04.426842 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.426902 kubelet[2614]: W0213 08:33:04.426854 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.426902 kubelet[2614]: E0213 08:33:04.426870 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.427084 kubelet[2614]: E0213 08:33:04.427071 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.427084 kubelet[2614]: W0213 08:33:04.427082 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.427167 kubelet[2614]: E0213 08:33:04.427097 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.427257 kubelet[2614]: E0213 08:33:04.427245 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.427257 kubelet[2614]: W0213 08:33:04.427256 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.427339 kubelet[2614]: E0213 08:33:04.427270 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.427471 kubelet[2614]: E0213 08:33:04.427459 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.427516 kubelet[2614]: W0213 08:33:04.427470 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.427516 kubelet[2614]: E0213 08:33:04.427484 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.427674 kubelet[2614]: E0213 08:33:04.427662 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.427718 kubelet[2614]: W0213 08:33:04.427677 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.427718 kubelet[2614]: E0213 08:33:04.427697 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.427897 kubelet[2614]: E0213 08:33:04.427887 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.427950 kubelet[2614]: W0213 08:33:04.427899 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.427950 kubelet[2614]: E0213 08:33:04.427922 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.428063 kubelet[2614]: E0213 08:33:04.428052 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.428063 kubelet[2614]: W0213 08:33:04.428062 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.428140 kubelet[2614]: E0213 08:33:04.428075 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.428215 kubelet[2614]: E0213 08:33:04.428206 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.428255 kubelet[2614]: W0213 08:33:04.428215 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.428255 kubelet[2614]: E0213 08:33:04.428226 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:04.428358 kubelet[2614]: E0213 08:33:04.428351 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.428394 kubelet[2614]: W0213 08:33:04.428358 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.428394 kubelet[2614]: E0213 08:33:04.428369 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:04.428636 kubelet[2614]: E0213 08:33:04.428625 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:04.428709 kubelet[2614]: W0213 08:33:04.428638 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:04.428709 kubelet[2614]: E0213 08:33:04.428655 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:05.265301 kubelet[2614]: E0213 08:33:05.265231 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:05.321530 kubelet[2614]: E0213 08:33:05.321478 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:07.264918 kubelet[2614]: E0213 08:33:07.264871 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:09.264314 kubelet[2614]: E0213 08:33:09.264268 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin 
not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:10.323321 kubelet[2614]: E0213 08:33:10.323261 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:11.264271 kubelet[2614]: E0213 08:33:11.264226 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:13.264870 kubelet[2614]: E0213 08:33:13.264821 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:15.264177 kubelet[2614]: E0213 08:33:15.264132 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:15.324990 kubelet[2614]: E0213 08:33:15.324903 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:17.264776 kubelet[2614]: E0213 08:33:17.264756 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:19.264585 kubelet[2614]: E0213 08:33:19.264480 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:20.326704 kubelet[2614]: E0213 08:33:20.326597 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:21.265094 kubelet[2614]: E0213 08:33:21.265049 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:23.265128 kubelet[2614]: E0213 08:33:23.265081 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:25.264921 kubelet[2614]: E0213 08:33:25.264875 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:25.328194 kubelet[2614]: E0213 08:33:25.328121 2614 kubelet.go:2475] "Container runtime network not 
ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:27.264348 kubelet[2614]: E0213 08:33:27.264301 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:29.264239 kubelet[2614]: E0213 08:33:29.264188 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:30.330361 kubelet[2614]: E0213 08:33:30.330204 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:30.364252 kubelet[2614]: E0213 08:33:30.364141 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.364252 kubelet[2614]: W0213 08:33:30.364185 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.364252 kubelet[2614]: E0213 08:33:30.364237 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.364794 kubelet[2614]: E0213 08:33:30.364770 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.364909 kubelet[2614]: W0213 08:33:30.364802 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.364909 kubelet[2614]: E0213 08:33:30.364844 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.365467 kubelet[2614]: E0213 08:33:30.365377 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.365467 kubelet[2614]: W0213 08:33:30.365411 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.365467 kubelet[2614]: E0213 08:33:30.365449 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.366088 kubelet[2614]: E0213 08:33:30.366014 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.366088 kubelet[2614]: W0213 08:33:30.366040 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.366088 kubelet[2614]: E0213 08:33:30.366075 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.366648 kubelet[2614]: E0213 08:33:30.366570 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.366648 kubelet[2614]: W0213 08:33:30.366603 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.366648 kubelet[2614]: E0213 08:33:30.366641 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.367198 kubelet[2614]: E0213 08:33:30.367121 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.367198 kubelet[2614]: W0213 08:33:30.367147 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.367198 kubelet[2614]: E0213 08:33:30.367181 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.367799 kubelet[2614]: E0213 08:33:30.367740 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.367799 kubelet[2614]: W0213 08:33:30.367772 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.368165 kubelet[2614]: E0213 08:33:30.367816 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.368431 kubelet[2614]: E0213 08:33:30.368343 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.368431 kubelet[2614]: W0213 08:33:30.368376 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.368431 kubelet[2614]: E0213 08:33:30.368419 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.368924 kubelet[2614]: E0213 08:33:30.368895 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.369089 kubelet[2614]: W0213 08:33:30.368921 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.369089 kubelet[2614]: E0213 08:33:30.368979 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.369540 kubelet[2614]: E0213 08:33:30.369508 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.369664 kubelet[2614]: W0213 08:33:30.369542 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.369664 kubelet[2614]: E0213 08:33:30.369581 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.370049 kubelet[2614]: E0213 08:33:30.370022 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.370049 kubelet[2614]: W0213 08:33:30.370047 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.370280 kubelet[2614]: E0213 08:33:30.370079 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.370601 kubelet[2614]: E0213 08:33:30.370570 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.370718 kubelet[2614]: W0213 08:33:30.370604 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.370718 kubelet[2614]: E0213 08:33:30.370643 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.371317 kubelet[2614]: E0213 08:33:30.371238 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.371317 kubelet[2614]: W0213 08:33:30.371272 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.371317 kubelet[2614]: E0213 08:33:30.371310 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.371789 kubelet[2614]: E0213 08:33:30.371762 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.371915 kubelet[2614]: W0213 08:33:30.371791 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.371915 kubelet[2614]: E0213 08:33:30.371828 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.372404 kubelet[2614]: E0213 08:33:30.372315 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.372404 kubelet[2614]: W0213 08:33:30.372348 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.372404 kubelet[2614]: E0213 08:33:30.372387 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.372919 kubelet[2614]: E0213 08:33:30.372891 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.372919 kubelet[2614]: W0213 08:33:30.372917 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.373151 kubelet[2614]: E0213 08:33:30.372967 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.373504 kubelet[2614]: E0213 08:33:30.373452 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.373677 kubelet[2614]: W0213 08:33:30.373515 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.373677 kubelet[2614]: E0213 08:33:30.373555 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.374061 kubelet[2614]: E0213 08:33:30.373984 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.374061 kubelet[2614]: W0213 08:33:30.374008 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.374061 kubelet[2614]: E0213 08:33:30.374040 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:30.374536 kubelet[2614]: E0213 08:33:30.374505 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.374665 kubelet[2614]: W0213 08:33:30.374538 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.374665 kubelet[2614]: E0213 08:33:30.374578 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:30.375056 kubelet[2614]: E0213 08:33:30.374986 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:30.375056 kubelet[2614]: W0213 08:33:30.375012 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:30.375056 kubelet[2614]: E0213 08:33:30.375043 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:31.265019 kubelet[2614]: E0213 08:33:31.264989 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:33.264809 kubelet[2614]: E0213 08:33:33.264723 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:34.334950 systemd[1]: Started sshd@11-145.40.67.89:22-161.35.108.241:35110.service. Feb 13 08:33:34.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.67.89:22-161.35.108.241:35110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:33:34.361545 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:33:34.361580 kernel: audit: type=1130 audit(1707813214.334:1173): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.67.89:22-161.35.108.241:35110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:33:34.777674 sshd[3563]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:33:34.777000 audit[3563]: USER_AUTH pid=3563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:33:34.874935 kernel: audit: type=1100 audit(1707813214.777:1174): pid=3563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:33:35.264357 kubelet[2614]: E0213 08:33:35.264336 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:35.331657 kubelet[2614]: E0213 08:33:35.331577 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:35.605000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 
13 08:33:35.605000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.791063 kernel: audit: type=1400 audit(1707813215.605:1175): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.791103 kernel: audit: type=1400 audit(1707813215.605:1176): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.791119 kernel: audit: type=1300 audit(1707813215.605:1176): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0030294a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:35.605000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0030294a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:35.912086 kernel: audit: type=1300 audit(1707813215.605:1175): arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0009fc6e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:35.605000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0009fc6e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:36.033233 kernel: audit: type=1327 audit(1707813215.605:1175): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:35.605000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:36.126934 kernel: audit: type=1327 audit(1707813215.605:1176): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:35.605000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:36.220504 kernel: audit: type=1400 audit(1707813215.887:1177): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011ff4960 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:36.412592 kernel: audit: type=1300 audit(1707813215.887:1177): arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011ff4960 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:35.887000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:33:35.887000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.887000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00d782980 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:35.887000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:33:35.888000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.888000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00b912930 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:35.888000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:33:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011d3a6c0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:33:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0114c1c50 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:33:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0119ab0b0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:33:35.889000 
audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:33:36.638921 sshd[3563]: Failed password for root from 161.35.108.241 port 35110 ssh2 Feb 13 08:33:36.811629 systemd[1]: Started sshd@12-145.40.67.89:22-43.153.15.221:51284.service. Feb 13 08:33:36.810000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.67.89:22-43.153.15.221:51284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:33:36.917834 sshd[3566]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:33:36.916000 audit[3566]: USER_AUTH pid=3566 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:33:37.264720 kubelet[2614]: E0213 08:33:37.264606 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:37.700103 sshd[3563]: Received disconnect from 161.35.108.241 port 35110:11: Bye Bye [preauth] Feb 13 08:33:37.700103 sshd[3563]: Disconnected from authenticating user root 161.35.108.241 port 35110 [preauth] Feb 13 08:33:37.702560 systemd[1]: sshd@11-145.40.67.89:22-161.35.108.241:35110.service: Deactivated successfully. 
Feb 13 08:33:37.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-145.40.67.89:22-161.35.108.241:35110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:33:39.265115 kubelet[2614]: E0213 08:33:39.265095 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:39.721911 sshd[3566]: Failed password for root from 43.153.15.221 port 51284 ssh2 Feb 13 08:33:39.779465 sshd[3566]: Received disconnect from 43.153.15.221 port 51284:11: Bye Bye [preauth] Feb 13 08:33:39.779465 sshd[3566]: Disconnected from authenticating user root 43.153.15.221 port 51284 [preauth] Feb 13 08:33:39.781967 systemd[1]: sshd@12-145.40.67.89:22-43.153.15.221:51284.service: Deactivated successfully. Feb 13 08:33:39.781000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.67.89:22-43.153.15.221:51284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:33:39.810213 kernel: kauditd_printk_skb: 19 callbacks suppressed Feb 13 08:33:39.810249 kernel: audit: type=1131 audit(1707813219.781:1186): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-145.40.67.89:22-43.153.15.221:51284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:33:40.332810 kubelet[2614]: E0213 08:33:40.332742 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:40.944000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:40.944000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00012f360 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:41.157866 kernel: audit: type=1400 audit(1707813220.944:1187): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:41.157923 kernel: audit: type=1300 audit(1707813220.944:1187): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00012f360 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:41.157949 kernel: audit: type=1327 audit(1707813220.944:1187): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:40.944000 audit: 
PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:40.945000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:41.264597 kubelet[2614]: E0213 08:33:41.264559 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:41.343366 kernel: audit: type=1400 audit(1707813220.945:1188): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:41.343416 kernel: audit: type=1300 audit(1707813220.945:1188): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00012f3e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:40.945000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00012f3e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:41.465278 kernel: audit: type=1327 audit(1707813220.945:1188): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:40.945000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:41.561378 kernel: audit: type=1400 audit(1707813220.946:1189): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:40.946000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:41.652939 kernel: audit: type=1300 audit(1707813220.946:1189): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002353560 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:40.946000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002353560 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" 
exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:41.776549 kernel: audit: type=1327 audit(1707813220.946:1189): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:40.946000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:40.950000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:33:40.950000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0009fcc60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:33:40.950000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:33:43.264983 kubelet[2614]: E0213 08:33:43.264920 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not 
initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:45.264863 kubelet[2614]: E0213 08:33:45.264840 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:45.294513 kubelet[2614]: E0213 08:33:45.294459 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.294513 kubelet[2614]: W0213 08:33:45.294477 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.294513 kubelet[2614]: E0213 08:33:45.294494 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.294726 kubelet[2614]: E0213 08:33:45.294711 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.294726 kubelet[2614]: W0213 08:33:45.294724 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.294804 kubelet[2614]: E0213 08:33:45.294738 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:45.294937 kubelet[2614]: E0213 08:33:45.294919 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.294937 kubelet[2614]: W0213 08:33:45.294934 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.295020 kubelet[2614]: E0213 08:33:45.294947 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.295237 kubelet[2614]: E0213 08:33:45.295192 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.295237 kubelet[2614]: W0213 08:33:45.295204 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.295237 kubelet[2614]: E0213 08:33:45.295219 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:45.295458 kubelet[2614]: E0213 08:33:45.295415 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.295458 kubelet[2614]: W0213 08:33:45.295427 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.295458 kubelet[2614]: E0213 08:33:45.295439 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.295641 kubelet[2614]: E0213 08:33:45.295630 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.295681 kubelet[2614]: W0213 08:33:45.295642 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.295681 kubelet[2614]: E0213 08:33:45.295656 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:45.295858 kubelet[2614]: E0213 08:33:45.295849 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.295898 kubelet[2614]: W0213 08:33:45.295858 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.295898 kubelet[2614]: E0213 08:33:45.295868 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.296043 kubelet[2614]: E0213 08:33:45.296003 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.296043 kubelet[2614]: W0213 08:33:45.296010 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.296043 kubelet[2614]: E0213 08:33:45.296022 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:45.334271 kubelet[2614]: E0213 08:33:45.334175 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:45.353440 kubelet[2614]: E0213 08:33:45.353333 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.353440 kubelet[2614]: W0213 08:33:45.353375 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.353440 kubelet[2614]: E0213 08:33:45.353420 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.354125 kubelet[2614]: E0213 08:33:45.354044 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.354125 kubelet[2614]: W0213 08:33:45.354070 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.354125 kubelet[2614]: E0213 08:33:45.354107 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:45.354733 kubelet[2614]: E0213 08:33:45.354644 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.354733 kubelet[2614]: W0213 08:33:45.354677 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.354733 kubelet[2614]: E0213 08:33:45.354716 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.355333 kubelet[2614]: E0213 08:33:45.355244 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.355333 kubelet[2614]: W0213 08:33:45.355278 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.355333 kubelet[2614]: E0213 08:33:45.355317 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:45.355978 kubelet[2614]: E0213 08:33:45.355912 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.355978 kubelet[2614]: W0213 08:33:45.355965 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.356243 kubelet[2614]: E0213 08:33:45.356006 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:45.356976 kubelet[2614]: E0213 08:33:45.356888 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:45.356976 kubelet[2614]: W0213 08:33:45.356915 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:45.356976 kubelet[2614]: E0213 08:33:45.356964 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:47.264371 kubelet[2614]: E0213 08:33:47.264324 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:49.264850 kubelet[2614]: E0213 08:33:49.264728 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:50.335498 kubelet[2614]: E0213 08:33:50.335441 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:51.264842 kubelet[2614]: E0213 08:33:51.264825 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:53.264258 kubelet[2614]: E0213 08:33:53.264213 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:54.582481 update_engine[1465]: I0213 08:33:54.582370 1465 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Feb 13 08:33:54.582481 
update_engine[1465]: I0213 08:33:54.582449 1465 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Feb 13 08:33:54.583489 update_engine[1465]: I0213 08:33:54.583082 1465 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Feb 13 08:33:54.584019 update_engine[1465]: I0213 08:33:54.583914 1465 omaha_request_params.cc:62] Current group set to lts Feb 13 08:33:54.584262 update_engine[1465]: I0213 08:33:54.584223 1465 update_attempter.cc:499] Already updated boot flags. Skipping. Feb 13 08:33:54.584262 update_engine[1465]: I0213 08:33:54.584244 1465 update_attempter.cc:643] Scheduling an action processor start. Feb 13 08:33:54.584531 update_engine[1465]: I0213 08:33:54.584278 1465 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 08:33:54.584531 update_engine[1465]: I0213 08:33:54.584348 1465 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Feb 13 08:33:54.584531 update_engine[1465]: I0213 08:33:54.584486 1465 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 08:33:54.584531 update_engine[1465]: I0213 08:33:54.584502 1465 omaha_request_action.cc:271] Request: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: Feb 13 08:33:54.584531 update_engine[1465]: I0213 08:33:54.584512 1465 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:33:54.585749 locksmithd[1508]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Feb 13 08:33:54.587638 update_engine[1465]: I0213 08:33:54.587558 1465 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:33:54.587839 
update_engine[1465]: E0213 08:33:54.587775 1465 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:33:54.587982 update_engine[1465]: I0213 08:33:54.587951 1465 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Feb 13 08:33:55.264335 kubelet[2614]: E0213 08:33:55.264316 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:55.337089 kubelet[2614]: E0213 08:33:55.337024 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:33:57.265048 kubelet[2614]: E0213 08:33:57.265002 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:33:58.309231 kubelet[2614]: E0213 08:33:58.309128 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.309231 kubelet[2614]: W0213 08:33:58.309174 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.309231 kubelet[2614]: E0213 08:33:58.309220 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:58.310325 kubelet[2614]: E0213 08:33:58.309782 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.310325 kubelet[2614]: W0213 08:33:58.309817 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.310325 kubelet[2614]: E0213 08:33:58.309857 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:58.310670 kubelet[2614]: E0213 08:33:58.310419 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.310670 kubelet[2614]: W0213 08:33:58.310454 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.310670 kubelet[2614]: E0213 08:33:58.310493 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:58.311109 kubelet[2614]: E0213 08:33:58.311073 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.311109 kubelet[2614]: W0213 08:33:58.311103 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.311435 kubelet[2614]: E0213 08:33:58.311139 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:58.311721 kubelet[2614]: E0213 08:33:58.311656 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.311721 kubelet[2614]: W0213 08:33:58.311691 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.311977 kubelet[2614]: E0213 08:33:58.311729 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:58.312361 kubelet[2614]: E0213 08:33:58.312280 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.312361 kubelet[2614]: W0213 08:33:58.312316 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.312361 kubelet[2614]: E0213 08:33:58.312356 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:58.312978 kubelet[2614]: E0213 08:33:58.312919 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.312978 kubelet[2614]: W0213 08:33:58.312974 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.313242 kubelet[2614]: E0213 08:33:58.313011 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:58.313603 kubelet[2614]: E0213 08:33:58.313523 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.313603 kubelet[2614]: W0213 08:33:58.313558 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.313603 kubelet[2614]: E0213 08:33:58.313604 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:58.314284 kubelet[2614]: E0213 08:33:58.314194 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.314284 kubelet[2614]: W0213 08:33:58.314232 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.314284 kubelet[2614]: E0213 08:33:58.314276 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:58.314836 kubelet[2614]: E0213 08:33:58.314808 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.315024 kubelet[2614]: W0213 08:33:58.314841 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.315024 kubelet[2614]: E0213 08:33:58.314877 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:58.315497 kubelet[2614]: E0213 08:33:58.315416 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.315497 kubelet[2614]: W0213 08:33:58.315451 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.315497 kubelet[2614]: E0213 08:33:58.315490 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:33:58.316054 kubelet[2614]: E0213 08:33:58.315977 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:33:58.316054 kubelet[2614]: W0213 08:33:58.316006 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:33:58.316054 kubelet[2614]: E0213 08:33:58.316039 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:33:59.264519 kubelet[2614]: E0213 08:33:59.264501 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:00.337417 kubelet[2614]: E0213 08:34:00.337399 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:01.264746 kubelet[2614]: E0213 08:34:01.264537 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:03.264623 kubelet[2614]: E0213 08:34:03.264546 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin 
not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:04.523297 update_engine[1465]: I0213 08:34:04.523164 1465 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:34:04.524205 update_engine[1465]: I0213 08:34:04.523644 1465 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:34:04.524205 update_engine[1465]: E0213 08:34:04.523846 1465 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:34:04.524205 update_engine[1465]: I0213 08:34:04.524057 1465 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Feb 13 08:34:05.264522 kubelet[2614]: E0213 08:34:05.264428 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:05.270829 kubelet[2614]: E0213 08:34:05.270748 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:34:05.270829 kubelet[2614]: W0213 08:34:05.270788 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:34:05.270829 kubelet[2614]: E0213 08:34:05.270831 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:34:05.271480 kubelet[2614]: E0213 08:34:05.271383 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:34:05.271480 kubelet[2614]: W0213 08:34:05.271418 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:34:05.271480 kubelet[2614]: E0213 08:34:05.271457 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:34:05.272036 kubelet[2614]: E0213 08:34:05.271965 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:34:05.272036 kubelet[2614]: W0213 08:34:05.271994 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:34:05.272036 kubelet[2614]: E0213 08:34:05.272029 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:34:05.272643 kubelet[2614]: E0213 08:34:05.272552 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:34:05.272643 kubelet[2614]: W0213 08:34:05.272587 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:34:05.272643 kubelet[2614]: E0213 08:34:05.272626 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:34:05.339218 kubelet[2614]: E0213 08:34:05.339114 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:07.265184 kubelet[2614]: E0213 08:34:07.265123 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:09.265367 kubelet[2614]: E0213 08:34:09.265298 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:10.340839 kubelet[2614]: E0213 08:34:10.340749 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:11.264942 
kubelet[2614]: E0213 08:34:11.264889 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:13.265075 kubelet[2614]: E0213 08:34:13.265028 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:14.523277 update_engine[1465]: I0213 08:34:14.523159 1465 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:34:14.524143 update_engine[1465]: I0213 08:34:14.523623 1465 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:34:14.524143 update_engine[1465]: E0213 08:34:14.523833 1465 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:34:14.524143 update_engine[1465]: I0213 08:34:14.524040 1465 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Feb 13 08:34:15.264971 kubelet[2614]: E0213 08:34:15.264852 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:15.342296 kubelet[2614]: E0213 08:34:15.342190 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:17.265126 kubelet[2614]: E0213 08:34:17.265079 2614 pod_workers.go:965] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:19.264985 kubelet[2614]: E0213 08:34:19.264964 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:20.344288 kubelet[2614]: E0213 08:34:20.344239 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:21.264446 kubelet[2614]: E0213 08:34:21.264401 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:23.264653 kubelet[2614]: E0213 08:34:23.264609 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:24.523318 update_engine[1465]: I0213 08:34:24.523188 1465 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:34:24.524176 update_engine[1465]: I0213 08:34:24.523676 1465 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:34:24.524176 update_engine[1465]: E0213 
08:34:24.523887 1465 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:34:24.524176 update_engine[1465]: I0213 08:34:24.524075 1465 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 08:34:24.524176 update_engine[1465]: I0213 08:34:24.524091 1465 omaha_request_action.cc:621] Omaha request response: Feb 13 08:34:24.524597 update_engine[1465]: E0213 08:34:24.524430 1465 omaha_request_action.cc:640] Omaha request network transfer failed. Feb 13 08:34:24.524597 update_engine[1465]: I0213 08:34:24.524463 1465 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Feb 13 08:34:24.524597 update_engine[1465]: I0213 08:34:24.524473 1465 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:34:24.524597 update_engine[1465]: I0213 08:34:24.524482 1465 update_attempter.cc:306] Processing Done. Feb 13 08:34:24.524597 update_engine[1465]: E0213 08:34:24.524507 1465 update_attempter.cc:619] Update failed. Feb 13 08:34:24.524597 update_engine[1465]: I0213 08:34:24.524516 1465 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Feb 13 08:34:24.524597 update_engine[1465]: I0213 08:34:24.524525 1465 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Feb 13 08:34:24.524597 update_engine[1465]: I0213 08:34:24.524535 1465 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Feb 13 08:34:24.525403 update_engine[1465]: I0213 08:34:24.524689 1465 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Feb 13 08:34:24.525403 update_engine[1465]: I0213 08:34:24.524740 1465 omaha_request_action.cc:270] Posting an Omaha request to disabled Feb 13 08:34:24.525403 update_engine[1465]: I0213 08:34:24.524750 1465 omaha_request_action.cc:271] Request: Feb 13 08:34:24.525403 update_engine[1465]: Feb 13 08:34:24.525403 update_engine[1465]: Feb 13 08:34:24.525403 update_engine[1465]: Feb 13 08:34:24.525403 update_engine[1465]: Feb 13 08:34:24.525403 update_engine[1465]: Feb 13 08:34:24.525403 update_engine[1465]: Feb 13 08:34:24.525403 update_engine[1465]: I0213 08:34:24.524760 1465 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Feb 13 08:34:24.525403 update_engine[1465]: I0213 08:34:24.525106 1465 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Feb 13 08:34:24.525403 update_engine[1465]: E0213 08:34:24.525272 1465 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Feb 13 08:34:24.525403 update_engine[1465]: I0213 08:34:24.525404 1465 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Feb 13 08:34:24.526592 locksmithd[1508]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Feb 13 08:34:24.526592 locksmithd[1508]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Feb 13 08:34:24.527277 update_engine[1465]: I0213 08:34:24.525419 1465 omaha_request_action.cc:621] Omaha request response: Feb 13 08:34:24.527277 update_engine[1465]: I0213 08:34:24.525430 1465 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Feb 13 08:34:24.527277 update_engine[1465]: I0213 08:34:24.525439 1465 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type 
OmahaRequestAction Feb 13 08:34:24.527277 update_engine[1465]: I0213 08:34:24.525446 1465 update_attempter.cc:306] Processing Done. Feb 13 08:34:24.527277 update_engine[1465]: I0213 08:34:24.525453 1465 update_attempter.cc:310] Error event sent. Feb 13 08:34:24.527277 update_engine[1465]: I0213 08:34:24.525475 1465 update_check_scheduler.cc:74] Next update check in 48m51s Feb 13 08:34:25.265263 kubelet[2614]: E0213 08:34:25.265216 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:25.346200 kubelet[2614]: E0213 08:34:25.346097 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:27.264156 kubelet[2614]: E0213 08:34:27.264109 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:29.264881 kubelet[2614]: E0213 08:34:29.264772 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:29.954334 systemd[1]: Started sshd@13-145.40.67.89:22-161.35.108.241:34876.service. 
Feb 13 08:34:29.953000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.89:22-161.35.108.241:34876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:29.981664 kernel: kauditd_printk_skb: 3 callbacks suppressed
Feb 13 08:34:29.981775 kernel: audit: type=1130 audit(1707813269.953:1191): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.89:22-161.35.108.241:34876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:30.347271 kubelet[2614]: E0213 08:34:30.347194 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:34:30.408959 sshd[3613]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root
Feb 13 08:34:30.408000 audit[3613]: ANOM_LOGIN_FAILURES pid=3613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:30.409075 sshd[3613]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked
Feb 13 08:34:30.408000 audit[3613]: USER_AUTH pid=3613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed'
Feb 13 08:34:30.561913 kernel: audit: type=2100 audit(1707813270.408:1192): pid=3613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:30.561959 kernel: audit: type=1100 audit(1707813270.408:1193): pid=3613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed'
Feb 13 08:34:31.265444 kubelet[2614]: E0213 08:34:31.265373 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:32.625993 sshd[3613]: Failed password for root from 161.35.108.241 port 34876 ssh2
Feb 13 08:34:33.265705 kubelet[2614]: E0213 08:34:33.265602 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:33.298504 kubelet[2614]: E0213 08:34:33.298443 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.298504 kubelet[2614]: W0213 08:34:33.298485 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.298966 kubelet[2614]: E0213 08:34:33.298531 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Feb 13 08:34:33.299129 kubelet[2614]: E0213 08:34:33.299017 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.299129 kubelet[2614]: W0213 08:34:33.299042 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.299129 kubelet[2614]: E0213 08:34:33.299076 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.299632 kubelet[2614]: E0213 08:34:33.299537 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.299632 kubelet[2614]: W0213 08:34:33.299571 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.299632 kubelet[2614]: E0213 08:34:33.299610 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.300301 kubelet[2614]: E0213 08:34:33.300253 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.300301 kubelet[2614]: W0213 08:34:33.300288 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.300659 kubelet[2614]: E0213 08:34:33.300333 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.300890 kubelet[2614]: E0213 08:34:33.300852 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.301085 kubelet[2614]: W0213 08:34:33.300897 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.301085 kubelet[2614]: E0213 08:34:33.300981 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.301617 kubelet[2614]: E0213 08:34:33.301543 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.301617 kubelet[2614]: W0213 08:34:33.301576 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.301617 kubelet[2614]: E0213 08:34:33.301615 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.302217 kubelet[2614]: E0213 08:34:33.302183 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.302363 kubelet[2614]: W0213 08:34:33.302218 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.302363 kubelet[2614]: E0213 08:34:33.302258 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Feb 13 08:34:33.302810 kubelet[2614]: E0213 08:34:33.302778 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.302810 kubelet[2614]: W0213 08:34:33.302804 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.303129 kubelet[2614]: E0213 08:34:33.302840 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.303405 kubelet[2614]: E0213 08:34:33.303359 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.303405 kubelet[2614]: W0213 08:34:33.303399 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.303796 kubelet[2614]: E0213 08:34:33.303440 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.304039 kubelet[2614]: E0213 08:34:33.304008 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.304039 kubelet[2614]: W0213 08:34:33.304033 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.304331 kubelet[2614]: E0213 08:34:33.304073 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.304570 kubelet[2614]: E0213 08:34:33.304537 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.304714 kubelet[2614]: W0213 08:34:33.304571 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.304714 kubelet[2614]: E0213 08:34:33.304611 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.305172 kubelet[2614]: E0213 08:34:33.305139 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.305308 kubelet[2614]: W0213 08:34:33.305174 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.305308 kubelet[2614]: E0213 08:34:33.305213 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.305768 kubelet[2614]: E0213 08:34:33.305700 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.305768 kubelet[2614]: W0213 08:34:33.305725 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.305768 kubelet[2614]: E0213 08:34:33.305760 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Feb 13 08:34:33.306314 kubelet[2614]: E0213 08:34:33.306240 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.306314 kubelet[2614]: W0213 08:34:33.306274 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.306314 kubelet[2614]: E0213 08:34:33.306313 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.306843 kubelet[2614]: E0213 08:34:33.306807 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.307033 kubelet[2614]: W0213 08:34:33.306845 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.307033 kubelet[2614]: E0213 08:34:33.306904 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.307484 kubelet[2614]: E0213 08:34:33.307412 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.307484 kubelet[2614]: W0213 08:34:33.307446 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.307484 kubelet[2614]: E0213 08:34:33.307485 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.335247 sshd[3613]: Received disconnect from 161.35.108.241 port 34876:11: Bye Bye [preauth]
Feb 13 08:34:33.335247 sshd[3613]: Disconnected from authenticating user root 161.35.108.241 port 34876 [preauth]
Feb 13 08:34:33.338019 systemd[1]: sshd@13-145.40.67.89:22-161.35.108.241:34876.service: Deactivated successfully.
Feb 13 08:34:33.337000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.89:22-161.35.108.241:34876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success'
Feb 13 08:34:33.430825 kubelet[2614]: E0213 08:34:33.430811 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.430825 kubelet[2614]: W0213 08:34:33.430822 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.430945 kernel: audit: type=1131 audit(1707813273.337:1194): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-145.40.67.89:22-161.35.108.241:34876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:33.430974 kubelet[2614]: E0213 08:34:33.430835 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.430974 kubelet[2614]: E0213 08:34:33.430942 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.430974 kubelet[2614]: W0213 08:34:33.430947 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.430974 kubelet[2614]: E0213 08:34:33.430953 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.431129 kubelet[2614]: E0213 08:34:33.431120 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431129 kubelet[2614]: W0213 08:34:33.431128 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431184 kubelet[2614]: E0213 08:34:33.431137 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.431284 kubelet[2614]: E0213 08:34:33.431249 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431284 kubelet[2614]: W0213 08:34:33.431255 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431284 kubelet[2614]: E0213 08:34:33.431264 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.431438 kubelet[2614]: E0213 08:34:33.431402 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431438 kubelet[2614]: W0213 08:34:33.431408 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431438 kubelet[2614]: E0213 08:34:33.431418 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.431579 kubelet[2614]: E0213 08:34:33.431548 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431579 kubelet[2614]: W0213 08:34:33.431554 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431579 kubelet[2614]: E0213 08:34:33.431563 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Feb 13 08:34:33.431735 kubelet[2614]: E0213 08:34:33.431729 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431758 kubelet[2614]: W0213 08:34:33.431736 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431758 kubelet[2614]: E0213 08:34:33.431744 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.431838 kubelet[2614]: E0213 08:34:33.431834 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431860 kubelet[2614]: W0213 08:34:33.431838 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431860 kubelet[2614]: E0213 08:34:33.431845 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.431908 kubelet[2614]: E0213 08:34:33.431903 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.431953 kubelet[2614]: W0213 08:34:33.431908 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.431953 kubelet[2614]: E0213 08:34:33.431915 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.432036 kubelet[2614]: E0213 08:34:33.432031 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.432057 kubelet[2614]: W0213 08:34:33.432036 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.432057 kubelet[2614]: E0213 08:34:33.432043 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.432133 kubelet[2614]: E0213 08:34:33.432129 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.432156 kubelet[2614]: W0213 08:34:33.432133 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.432156 kubelet[2614]: E0213 08:34:33.432139 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:33.432311 kubelet[2614]: E0213 08:34:33.432306 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:33.432331 kubelet[2614]: W0213 08:34:33.432311 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:33.432331 kubelet[2614]: E0213 08:34:33.432317 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:34.936070 systemd[1]: Started sshd@14-145.40.67.89:22-43.153.15.221:41886.service.
Feb 13 08:34:34.935000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.67.89:22-43.153.15.221:41886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success'
Feb 13 08:34:35.029020 kernel: audit: type=1130 audit(1707813274.935:1195): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.67.89:22-43.153.15.221:41886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:35.102863 sshd[3647]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root
Feb 13 08:34:35.102000 audit[3647]: ANOM_LOGIN_FAILURES pid=3647 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:35.103139 sshd[3647]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked
Feb 13 08:34:35.102000 audit[3647]: USER_AUTH pid=3647 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed'
Feb 13 08:34:35.265065 kubelet[2614]: E0213 08:34:35.264932 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:35.267503 kernel: audit: type=2100 audit(1707813275.102:1196): pid=3647 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success'
Feb 13 08:34:35.267543 kernel: audit: type=1100 audit(1707813275.102:1197): pid=3647 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed'
Feb 13 08:34:35.348383 kubelet[2614]: E0213 08:34:35.348358 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:34:35.605000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.605000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e39500 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:34:35.820377 kernel: audit: type=1400 audit(1707813275.605:1198): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.820431 kernel: audit: type=1300 audit(1707813275.605:1198): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e39500 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:34:35.820449 kernel: audit: type=1327 audit(1707813275.605:1198):
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:34:35.605000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:34:35.606000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:36.006406 kernel: audit: type=1400 audit(1707813275.606:1199): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:36.006486 kernel: audit: type=1300 audit(1707813275.606:1199): arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0017af1a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:34:35.606000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c0017af1a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:34:36.127725 kernel: audit: type=1327 audit(1707813275.606:1199): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:34:35.606000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:34:36.221676 kernel: audit: type=1400 audit(1707813275.886:1200): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:36.313415 kernel: audit: type=1300 audit(1707813275.886:1200): arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011a9b830 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:34:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011a9b830 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13
08:34:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:34:35.886000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.886000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00310b540 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:34:35.886000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:34:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0047b6a60 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:34:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:34:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c011955950 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:34:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:34:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0054e0f60 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:34:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:34:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:34:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c011955980 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:34:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:34:37.265100 kubelet[2614]: E0213 08:34:37.265055 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:37.671194 sshd[3647]: Failed password for root from 43.153.15.221 port 41886 ssh2
Feb 13 08:34:37.962150 sshd[3647]: Received disconnect from 43.153.15.221 port 41886:11: Bye Bye [preauth]
Feb 13 08:34:37.962150 sshd[3647]: Disconnected from authenticating user root 43.153.15.221 port 41886 [preauth]
Feb 13 08:34:37.964600 systemd[1]:
sshd@14-145.40.67.89:22-43.153.15.221:41886.service: Deactivated successfully. Feb 13 08:34:37.964000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-145.40.67.89:22-43.153.15.221:41886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:34:39.265022 kubelet[2614]: E0213 08:34:39.264952 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:40.350477 kubelet[2614]: E0213 08:34:40.350373 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:40.945000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:40.990271 kernel: kauditd_printk_skb: 17 callbacks suppressed Feb 13 08:34:40.990347 kernel: audit: type=1400 audit(1707813280.945:1207): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:41.081962 kernel: audit: type=1300 audit(1707813280.945:1207): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0018cdc20 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:40.945000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0018cdc20 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:40.945000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:41.265115 kubelet[2614]: E0213 08:34:41.265104 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:41.301516 kernel: audit: type=1327 audit(1707813280.945:1207): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:41.301551 kernel: audit: type=1400 audit(1707813280.945:1208): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:40.945000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:40.945000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c000bfdac0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:41.518181 kernel: audit: type=1300 audit(1707813280.945:1208): arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c000bfdac0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:41.518210 kernel: audit: type=1327 audit(1707813280.945:1208): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:40.945000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:41.612387 kernel: audit: type=1400 audit(1707813280.947:1209): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:40.947000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:41.704377 kernel: audit: type=1300 audit(1707813280.947:1209): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0028feec0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:40.947000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0028feec0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:41.825839 kernel: audit: type=1327 audit(1707813280.947:1209): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:40.947000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:41.920215 kernel: audit: type=1400 audit(1707813280.950:1210): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:40.950000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" 
dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:34:40.950000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0018cdd40 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:34:40.950000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:34:43.265124 kubelet[2614]: E0213 08:34:43.265074 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:45.265066 kubelet[2614]: E0213 08:34:45.265021 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:34:45.351685 kubelet[2614]: E0213 08:34:45.351614 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:34:47.264388 kubelet[2614]: E0213 08:34:47.264335 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:49.264250 kubelet[2614]: E0213 08:34:49.264206 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:50.353252 kubelet[2614]: E0213 08:34:50.353150 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:34:51.265027 kubelet[2614]: E0213 08:34:51.264981 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:53.265004 kubelet[2614]: E0213 08:34:53.264987 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:55.264252 kubelet[2614]: E0213 08:34:55.264192 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:55.296164 kubelet[2614]: E0213 08:34:55.296107 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.296164 kubelet[2614]: W0213 08:34:55.296125 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.296164 kubelet[2614]: E0213 08:34:55.296143 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.296367 kubelet[2614]: E0213 08:34:55.296359 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.296405 kubelet[2614]: W0213 08:34:55.296369 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.296405 kubelet[2614]: E0213 08:34:55.296383 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.296643 kubelet[2614]: E0213 08:34:55.296599 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.296643 kubelet[2614]: W0213 08:34:55.296610 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.296643 kubelet[2614]: E0213 08:34:55.296623 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Feb 13 08:34:55.296857 kubelet[2614]: E0213 08:34:55.296845 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.296857 kubelet[2614]: W0213 08:34:55.296855 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.296947 kubelet[2614]: E0213 08:34:55.296866 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.297076 kubelet[2614]: E0213 08:34:55.297035 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.297076 kubelet[2614]: W0213 08:34:55.297046 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.297076 kubelet[2614]: E0213 08:34:55.297058 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.297289 kubelet[2614]: E0213 08:34:55.297250 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.297289 kubelet[2614]: W0213 08:34:55.297261 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.297289 kubelet[2614]: E0213 08:34:55.297273 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.297461 kubelet[2614]: E0213 08:34:55.297452 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.297505 kubelet[2614]: W0213 08:34:55.297462 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.297505 kubelet[2614]: E0213 08:34:55.297473 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.297626 kubelet[2614]: E0213 08:34:55.297619 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.297661 kubelet[2614]: W0213 08:34:55.297626 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.297661 kubelet[2614]: E0213 08:34:55.297636 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.297786 kubelet[2614]: E0213 08:34:55.297778 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.297831 kubelet[2614]: W0213 08:34:55.297786 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.297831 kubelet[2614]: E0213 08:34:55.297795 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.297940 kubelet[2614]: E0213 08:34:55.297932 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.297940 kubelet[2614]: W0213 08:34:55.297939 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298013 kubelet[2614]: E0213 08:34:55.297949 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298060 kubelet[2614]: E0213 08:34:55.298053 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298102 kubelet[2614]: W0213 08:34:55.298060 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298102 kubelet[2614]: E0213 08:34:55.298069 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298174 kubelet[2614]: E0213 08:34:55.298166 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298174 kubelet[2614]: W0213 08:34:55.298173 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298242 kubelet[2614]: E0213 08:34:55.298182 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298304 kubelet[2614]: E0213 08:34:55.298296 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298304 kubelet[2614]: W0213 08:34:55.298304 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298373 kubelet[2614]: E0213 08:34:55.298313 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298415 kubelet[2614]: E0213 08:34:55.298408 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298415 kubelet[2614]: W0213 08:34:55.298415 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298479 kubelet[2614]: E0213 08:34:55.298424 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298527 kubelet[2614]: E0213 08:34:55.298520 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298567 kubelet[2614]: W0213 08:34:55.298527 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298567 kubelet[2614]: E0213 08:34:55.298536 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298645 kubelet[2614]: E0213 08:34:55.298638 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298645 kubelet[2614]: W0213 08:34:55.298645 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298711 kubelet[2614]: E0213 08:34:55.298656 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298766 kubelet[2614]: E0213 08:34:55.298759 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298766 kubelet[2614]: W0213 08:34:55.298766 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298834 kubelet[2614]: E0213 08:34:55.298775 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.298880 kubelet[2614]: E0213 08:34:55.298873 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.298880 kubelet[2614]: W0213 08:34:55.298880 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.298989 kubelet[2614]: E0213 08:34:55.298890 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.299027 kubelet[2614]: E0213 08:34:55.298996 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.299027 kubelet[2614]: W0213 08:34:55.299002 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.299027 kubelet[2614]: E0213 08:34:55.299011 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.299158 kubelet[2614]: E0213 08:34:55.299150 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:34:55.299195 kubelet[2614]: W0213 08:34:55.299158 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:34:55.299195 kubelet[2614]: E0213 08:34:55.299168 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:34:55.355448 kubelet[2614]: E0213 08:34:55.355379 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:34:57.264881 kubelet[2614]: E0213 08:34:57.264831 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:34:59.265079 kubelet[2614]: E0213 08:34:59.265033 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:35:00.357634 kubelet[2614]: E0213 08:35:00.357429 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:35:01.265245 kubelet[2614]: E0213 08:35:01.265145 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:35:03.264462 kubelet[2614]: E0213 08:35:03.264382 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:35:05.264761 kubelet[2614]: E0213 08:35:05.264666 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:35:05.359300 kubelet[2614]: E0213 08:35:05.359210 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:35:07.264875 kubelet[2614]: E0213 08:35:07.264827 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:35:09.264869 kubelet[2614]: E0213 08:35:09.264823 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:10.360999 kubelet[2614]: E0213 08:35:10.360889 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:11.265035 kubelet[2614]: E0213 08:35:11.264984 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:13.265161 kubelet[2614]: E0213 08:35:13.265115 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:14.361553 kubelet[2614]: E0213 08:35:14.361440 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.361553 kubelet[2614]: W0213 08:35:14.361487 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.361553 kubelet[2614]: E0213 08:35:14.361533 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.362675 kubelet[2614]: E0213 08:35:14.362021 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.362675 kubelet[2614]: W0213 08:35:14.362049 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.362675 kubelet[2614]: E0213 08:35:14.362084 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:14.362675 kubelet[2614]: E0213 08:35:14.362561 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.362675 kubelet[2614]: W0213 08:35:14.362595 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.362675 kubelet[2614]: E0213 08:35:14.362634 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.363354 kubelet[2614]: E0213 08:35:14.363298 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.363354 kubelet[2614]: W0213 08:35:14.363332 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.363665 kubelet[2614]: E0213 08:35:14.363371 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:14.363952 kubelet[2614]: E0213 08:35:14.363891 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.364090 kubelet[2614]: W0213 08:35:14.363950 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.364090 kubelet[2614]: E0213 08:35:14.363995 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.364590 kubelet[2614]: E0213 08:35:14.364509 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.364590 kubelet[2614]: W0213 08:35:14.364544 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.364590 kubelet[2614]: E0213 08:35:14.364583 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:14.365256 kubelet[2614]: E0213 08:35:14.365174 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.365256 kubelet[2614]: W0213 08:35:14.365209 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.365256 kubelet[2614]: E0213 08:35:14.365249 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.365811 kubelet[2614]: E0213 08:35:14.365776 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.365962 kubelet[2614]: W0213 08:35:14.365813 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.365962 kubelet[2614]: E0213 08:35:14.365852 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:14.447669 kubelet[2614]: E0213 08:35:14.447543 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.447669 kubelet[2614]: W0213 08:35:14.447604 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.447669 kubelet[2614]: E0213 08:35:14.447660 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.448372 kubelet[2614]: E0213 08:35:14.448294 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.448372 kubelet[2614]: W0213 08:35:14.448324 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.448372 kubelet[2614]: E0213 08:35:14.448368 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:14.448880 kubelet[2614]: E0213 08:35:14.448850 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.449046 kubelet[2614]: W0213 08:35:14.448884 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.449046 kubelet[2614]: E0213 08:35:14.448950 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.449500 kubelet[2614]: E0213 08:35:14.449412 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.449500 kubelet[2614]: W0213 08:35:14.449438 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.449500 kubelet[2614]: E0213 08:35:14.449481 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:14.450057 kubelet[2614]: E0213 08:35:14.449988 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.450057 kubelet[2614]: W0213 08:35:14.450014 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.450057 kubelet[2614]: E0213 08:35:14.450058 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:14.450920 kubelet[2614]: E0213 08:35:14.450869 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:14.450920 kubelet[2614]: W0213 08:35:14.450901 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:14.451209 kubelet[2614]: E0213 08:35:14.450971 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.265277 kubelet[2614]: E0213 08:35:15.265210 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:15.273883 kubelet[2614]: E0213 08:35:15.273793 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.273883 kubelet[2614]: W0213 08:35:15.273832 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.273883 kubelet[2614]: E0213 08:35:15.273879 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:15.274489 kubelet[2614]: E0213 08:35:15.274409 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.274489 kubelet[2614]: W0213 08:35:15.274443 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.274489 kubelet[2614]: E0213 08:35:15.274483 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.275081 kubelet[2614]: E0213 08:35:15.275000 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.275081 kubelet[2614]: W0213 08:35:15.275027 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.275081 kubelet[2614]: E0213 08:35:15.275066 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:15.275710 kubelet[2614]: E0213 08:35:15.275629 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.275710 kubelet[2614]: W0213 08:35:15.275663 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.275710 kubelet[2614]: E0213 08:35:15.275703 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.276326 kubelet[2614]: E0213 08:35:15.276245 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.276326 kubelet[2614]: W0213 08:35:15.276280 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.276326 kubelet[2614]: E0213 08:35:15.276320 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:15.276873 kubelet[2614]: E0213 08:35:15.276840 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.277019 kubelet[2614]: W0213 08:35:15.276874 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.277019 kubelet[2614]: E0213 08:35:15.276914 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.277598 kubelet[2614]: E0213 08:35:15.277517 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.277598 kubelet[2614]: W0213 08:35:15.277551 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.277598 kubelet[2614]: E0213 08:35:15.277591 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:15.278208 kubelet[2614]: E0213 08:35:15.278127 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.278208 kubelet[2614]: W0213 08:35:15.278162 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.278208 kubelet[2614]: E0213 08:35:15.278205 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.278779 kubelet[2614]: E0213 08:35:15.278719 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.278779 kubelet[2614]: W0213 08:35:15.278754 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.279076 kubelet[2614]: E0213 08:35:15.278794 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:15.279428 kubelet[2614]: E0213 08:35:15.279348 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.279428 kubelet[2614]: W0213 08:35:15.279381 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.279428 kubelet[2614]: E0213 08:35:15.279422 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.279970 kubelet[2614]: E0213 08:35:15.279921 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.280116 kubelet[2614]: W0213 08:35:15.279970 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.280116 kubelet[2614]: E0213 08:35:15.280007 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:15.280583 kubelet[2614]: E0213 08:35:15.280512 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:15.280583 kubelet[2614]: W0213 08:35:15.280546 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:15.280583 kubelet[2614]: E0213 08:35:15.280584 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:15.363021 kubelet[2614]: E0213 08:35:15.362954 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:17.265056 kubelet[2614]: E0213 08:35:17.265004 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:19.265080 kubelet[2614]: E0213 08:35:19.265031 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:20.364878 kubelet[2614]: E0213 08:35:20.364777 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:21.264645 
kubelet[2614]: E0213 08:35:21.264544 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:23.264795 kubelet[2614]: E0213 08:35:23.264749 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:25.265057 kubelet[2614]: E0213 08:35:25.265040 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:25.367127 kubelet[2614]: E0213 08:35:25.367074 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:27.265113 kubelet[2614]: E0213 08:35:27.264997 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:28.507384 systemd[1]: Started sshd@15-145.40.67.89:22-161.35.108.241:34734.service. 
Feb 13 08:35:28.506000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.67.89:22-161.35.108.241:34734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:28.535061 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:35:28.535138 kernel: audit: type=1130 audit(1707813328.506:1211): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.67.89:22-161.35.108.241:34734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:28.984044 sshd[3704]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:35:28.983000 audit[3704]: ANOM_LOGIN_FAILURES pid=3704 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:28.984294 sshd[3704]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:35:28.983000 audit[3704]: USER_AUTH pid=3704 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:35:29.146278 kernel: audit: type=2100 audit(1707813328.983:1212): pid=3704 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:29.146309 kernel: audit: type=1100 audit(1707813328.983:1213): pid=3704 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:35:29.264868 kubelet[2614]: E0213 08:35:29.264790 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:30.307062 kubelet[2614]: E0213 08:35:30.306963 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:30.307062 kubelet[2614]: W0213 08:35:30.307007 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:30.307062 kubelet[2614]: E0213 08:35:30.307062 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:30.308316 kubelet[2614]: E0213 08:35:30.307579 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:30.308316 kubelet[2614]: W0213 08:35:30.307611 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:30.308316 kubelet[2614]: E0213 08:35:30.307650 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:30.308316 kubelet[2614]: E0213 08:35:30.308205 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:30.308731 kubelet[2614]: W0213 08:35:30.308290 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:30.308731 kubelet[2614]: E0213 08:35:30.308380 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:30.309023 kubelet[2614]: E0213 08:35:30.308918 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:30.309023 kubelet[2614]: W0213 08:35:30.308967 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:30.309023 kubelet[2614]: E0213 08:35:30.309007 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:30.369097 kubelet[2614]: E0213 08:35:30.369042 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:30.695701 systemd[1]: Started sshd@16-145.40.67.89:22-139.178.68.195:39908.service. Feb 13 08:35:30.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.67.89:22-139.178.68.195:39908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:30.785936 kernel: audit: type=1130 audit(1707813330.694:1214): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.67.89:22-139.178.68.195:39908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:30.813000 audit[3711]: USER_ACCT pid=3711 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.813992 sshd[3711]: Accepted publickey for core from 139.178.68.195 port 39908 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:35:30.817237 sshd[3711]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:35:30.820317 systemd-logind[1463]: New session 10 of user core. Feb 13 08:35:30.820794 systemd[1]: Started session-10.scope. Feb 13 08:35:30.829022 sshd[3704]: Failed password for root from 161.35.108.241 port 34734 ssh2 Feb 13 08:35:30.816000 audit[3711]: CRED_ACQ pid=3711 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.907218 sshd[3711]: pam_unix(sshd:session): session closed for user core Feb 13 08:35:30.908660 systemd[1]: sshd@16-145.40.67.89:22-139.178.68.195:39908.service: Deactivated successfully. Feb 13 08:35:30.909070 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 08:35:30.909445 systemd-logind[1463]: Session 10 logged out. Waiting for processes to exit. Feb 13 08:35:30.909861 systemd-logind[1463]: Removed session 10. 
Feb 13 08:35:30.998335 kernel: audit: type=1101 audit(1707813330.813:1215): pid=3711 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.998373 kernel: audit: type=1103 audit(1707813330.816:1216): pid=3711 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.998391 kernel: audit: type=1006 audit(1707813330.816:1217): pid=3711 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Feb 13 08:35:31.057531 kernel: audit: type=1300 audit(1707813330.816:1217): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8a1d35f0 a2=3 a3=0 items=0 ppid=1 pid=3711 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:30.816000 audit[3711]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff8a1d35f0 a2=3 a3=0 items=0 ppid=1 pid=3711 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:31.150474 kernel: audit: type=1327 audit(1707813330.816:1217): proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:30.816000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:31.181479 kernel: audit: type=1105 audit(1707813330.822:1218): pid=3711 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.822000 audit[3711]: USER_START pid=3711 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:31.264360 kubelet[2614]: E0213 08:35:31.264317 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:30.822000 audit[3713]: CRED_ACQ pid=3713 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.907000 audit[3711]: USER_END pid=3711 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.907000 audit[3711]: CRED_DISP pid=3711 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:30.907000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-145.40.67.89:22-139.178.68.195:39908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:31.911852 sshd[3704]: Received disconnect from 161.35.108.241 port 34734:11: Bye Bye [preauth] Feb 13 08:35:31.911852 sshd[3704]: Disconnected from authenticating user root 161.35.108.241 port 34734 [preauth] Feb 13 08:35:31.914632 systemd[1]: sshd@15-145.40.67.89:22-161.35.108.241:34734.service: Deactivated successfully. Feb 13 08:35:31.914000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-145.40.67.89:22-161.35.108.241:34734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:32.696767 systemd[1]: Started sshd@17-145.40.67.89:22-43.153.15.221:60730.service. Feb 13 08:35:32.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.67.89:22-43.153.15.221:60730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:32.840374 sshd[3738]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:35:32.839000 audit[3738]: USER_AUTH pid=3738 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:35:33.265736 kubelet[2614]: E0213 08:35:33.265618 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:34.565153 sshd[3738]: Failed password for root from 43.153.15.221 port 60730 ssh2 Feb 13 08:35:35.265100 kubelet[2614]: E0213 08:35:35.265005 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:35.370770 kubelet[2614]: E0213 08:35:35.370677 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:35.607000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.639281 kernel: kauditd_printk_skb: 7 callbacks suppressed Feb 13 08:35:35.639342 kernel: audit: type=1400 audit(1707813335.607:1226): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.607000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001023ad0 a2=fc6 a3=0 
items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:35.731056 sshd[3738]: Received disconnect from 43.153.15.221 port 60730:11: Bye Bye [preauth] Feb 13 08:35:35.731056 sshd[3738]: Disconnected from authenticating user root 43.153.15.221 port 60730 [preauth] Feb 13 08:35:35.731649 systemd[1]: sshd@17-145.40.67.89:22-43.153.15.221:60730.service: Deactivated successfully. Feb 13 08:35:35.850666 kernel: audit: type=1300 audit(1707813335.607:1226): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001023ad0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:35.850707 kernel: audit: type=1327 audit(1707813335.607:1226): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:35.607000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:35.909253 systemd[1]: Started sshd@18-145.40.67.89:22-139.178.68.195:39914.service. 
Feb 13 08:35:35.943469 kernel: audit: type=1400 audit(1707813335.607:1227): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.607000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.607000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00216ef80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:36.033973 kernel: audit: type=1300 audit(1707813335.607:1227): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00216ef80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:36.061534 sshd[3742]: Accepted publickey for core from 139.178.68.195 port 39914 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:35:36.062936 sshd[3742]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:35:36.065566 systemd-logind[1463]: New session 11 of user core. Feb 13 08:35:36.066665 systemd[1]: Started session-11.scope. Feb 13 08:35:36.145473 sshd[3742]: pam_unix(sshd:session): session closed for user core Feb 13 08:35:36.146989 systemd[1]: sshd@18-145.40.67.89:22-139.178.68.195:39914.service: Deactivated successfully. 
Feb 13 08:35:36.147756 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 08:35:36.148308 systemd-logind[1463]: Session 11 logged out. Waiting for processes to exit. Feb 13 08:35:36.148767 systemd-logind[1463]: Removed session 11. Feb 13 08:35:35.607000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:36.245877 kernel: audit: type=1327 audit(1707813335.607:1227): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:36.245939 kernel: audit: type=1131 audit(1707813335.730:1228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.67.89:22-43.153.15.221:60730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:35.730000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-145.40.67.89:22-43.153.15.221:60730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:36.335219 kernel: audit: type=1400 audit(1707813335.889:1229): avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:36.426678 kernel: audit: type=1300 audit(1707813335.889:1229): arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0114c0060 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0114c0060 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:36.618843 kernel: audit: type=1327 audit(1707813335.889:1229): 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:35.889000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.889000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c00385ba40 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.889000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0114c00c0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.890000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0114c0120 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.890000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0079c4300 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.890000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c002f129c0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:35:35.890000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:35:35.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.67.89:22-139.178.68.195:39914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:36.060000 audit[3742]: USER_ACCT pid=3742 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:36.061000 audit[3742]: CRED_ACQ pid=3742 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:36.061000 audit[3742]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe4c6f1a0 a2=3 a3=0 items=0 ppid=1 pid=3742 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:36.061000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:36.069000 audit[3742]: USER_START pid=3742 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:36.069000 audit[3745]: CRED_ACQ pid=3745 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:36.145000 audit[3742]: USER_END pid=3742 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:36.145000 audit[3742]: CRED_DISP 
pid=3742 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:36.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-145.40.67.89:22-139.178.68.195:39914 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:37.264873 kubelet[2614]: E0213 08:35:37.264853 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:39.265249 kubelet[2614]: E0213 08:35:39.265202 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:40.372847 kubelet[2614]: E0213 08:35:40.372749 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:40.946000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:40.975363 kernel: kauditd_printk_skb: 26 callbacks suppressed Feb 13 08:35:40.975452 kernel: audit: type=1400 audit(1707813340.946:1244): avc: denied { watch } for pid=2442 comm="kube-controller" 
path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:40.946000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b445c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:41.064976 kernel: audit: type=1300 audit(1707813340.946:1244): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b445c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:41.148497 systemd[1]: Started sshd@19-145.40.67.89:22-139.178.68.195:36128.service. 
Feb 13 08:35:40.946000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:41.265167 kubelet[2614]: E0213 08:35:41.265105 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:41.278570 kernel: audit: type=1327 audit(1707813340.946:1244): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:41.278604 kernel: audit: type=1400 audit(1707813340.946:1245): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:40.946000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:41.307386 sshd[3771]: Accepted publickey for core from 139.178.68.195 port 36128 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:35:41.308198 sshd[3771]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:35:41.310597 systemd-logind[1463]: New session 12 of user core. 
Feb 13 08:35:41.311105 systemd[1]: Started session-12.scope. Feb 13 08:35:40.946000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002a1a760 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:41.369013 kernel: audit: type=1300 audit(1707813340.946:1245): arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002a1a760 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:41.387999 sshd[3771]: pam_unix(sshd:session): session closed for user core Feb 13 08:35:41.389481 systemd[1]: sshd@19-145.40.67.89:22-139.178.68.195:36128.service: Deactivated successfully. Feb 13 08:35:41.389902 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 08:35:41.390286 systemd-logind[1463]: Session 12 logged out. Waiting for processes to exit. Feb 13 08:35:41.390755 systemd-logind[1463]: Removed session 12. 
Feb 13 08:35:40.946000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:41.489993 kernel: audit: type=1327 audit(1707813340.946:1245): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:40.948000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:41.673702 kernel: audit: type=1400 audit(1707813340.948:1246): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:41.673728 kernel: audit: type=1300 audit(1707813340.948:1246): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002354740 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:40.948000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002354740 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:41.794644 kernel: audit: type=1327 audit(1707813340.948:1246): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:40.948000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:41.888178 kernel: audit: type=1400 audit(1707813340.950:1247): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:40.950000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:35:40.950000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002a1a880 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:35:40.950000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:35:41.147000 
audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.67.89:22-139.178.68.195:36128 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:41.306000 audit[3771]: USER_ACCT pid=3771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:41.307000 audit[3771]: CRED_ACQ pid=3771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:41.307000 audit[3771]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe00290c50 a2=3 a3=0 items=0 ppid=1 pid=3771 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:41.307000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:41.312000 audit[3771]: USER_START pid=3771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:41.312000 audit[3773]: CRED_ACQ pid=3773 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:41.387000 audit[3771]: USER_END pid=3771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:41.387000 audit[3771]: CRED_DISP pid=3771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:41.388000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-145.40.67.89:22-139.178.68.195:36128 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:42.295745 kubelet[2614]: E0213 08:35:42.295690 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.295745 kubelet[2614]: W0213 08:35:42.295710 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.295745 kubelet[2614]: E0213 08:35:42.295731 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.296157 kubelet[2614]: E0213 08:35:42.295891 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.296157 kubelet[2614]: W0213 08:35:42.295899 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.296157 kubelet[2614]: E0213 08:35:42.295911 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.296157 kubelet[2614]: E0213 08:35:42.296094 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.296157 kubelet[2614]: W0213 08:35:42.296105 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.296157 kubelet[2614]: E0213 08:35:42.296121 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.296372 kubelet[2614]: E0213 08:35:42.296354 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.296372 kubelet[2614]: W0213 08:35:42.296366 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.296459 kubelet[2614]: E0213 08:35:42.296380 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.296594 kubelet[2614]: E0213 08:35:42.296555 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.296594 kubelet[2614]: W0213 08:35:42.296564 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.296594 kubelet[2614]: E0213 08:35:42.296575 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.296737 kubelet[2614]: E0213 08:35:42.296709 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.296737 kubelet[2614]: W0213 08:35:42.296717 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.296737 kubelet[2614]: E0213 08:35:42.296727 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.296899 kubelet[2614]: E0213 08:35:42.296889 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.296899 kubelet[2614]: W0213 08:35:42.296898 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.296979 kubelet[2614]: E0213 08:35:42.296909 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.297045 kubelet[2614]: E0213 08:35:42.297034 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297045 kubelet[2614]: W0213 08:35:42.297043 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.297122 kubelet[2614]: E0213 08:35:42.297054 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.297243 kubelet[2614]: E0213 08:35:42.297204 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297243 kubelet[2614]: W0213 08:35:42.297212 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.297243 kubelet[2614]: E0213 08:35:42.297222 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.297464 kubelet[2614]: E0213 08:35:42.297426 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297464 kubelet[2614]: W0213 08:35:42.297434 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.297464 kubelet[2614]: E0213 08:35:42.297444 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.297605 kubelet[2614]: E0213 08:35:42.297553 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297605 kubelet[2614]: W0213 08:35:42.297561 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.297605 kubelet[2614]: E0213 08:35:42.297571 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.297704 kubelet[2614]: E0213 08:35:42.297686 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297704 kubelet[2614]: W0213 08:35:42.297693 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.297778 kubelet[2614]: E0213 08:35:42.297705 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.297833 kubelet[2614]: E0213 08:35:42.297824 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297833 kubelet[2614]: W0213 08:35:42.297832 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.297904 kubelet[2614]: E0213 08:35:42.297843 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.297977 kubelet[2614]: E0213 08:35:42.297968 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.297977 kubelet[2614]: W0213 08:35:42.297976 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.298057 kubelet[2614]: E0213 08:35:42.297986 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.298110 kubelet[2614]: E0213 08:35:42.298101 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.298150 kubelet[2614]: W0213 08:35:42.298109 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.298150 kubelet[2614]: E0213 08:35:42.298119 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.298319 kubelet[2614]: E0213 08:35:42.298309 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.298319 kubelet[2614]: W0213 08:35:42.298318 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.298402 kubelet[2614]: E0213 08:35:42.298329 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.355229 kubelet[2614]: E0213 08:35:42.355116 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.355229 kubelet[2614]: W0213 08:35:42.355174 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.355229 kubelet[2614]: E0213 08:35:42.355226 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.355857 kubelet[2614]: E0213 08:35:42.355777 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.355857 kubelet[2614]: W0213 08:35:42.355805 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.355857 kubelet[2614]: E0213 08:35:42.355851 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.356512 kubelet[2614]: E0213 08:35:42.356433 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.356512 kubelet[2614]: W0213 08:35:42.356473 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.356852 kubelet[2614]: E0213 08:35:42.356533 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.357089 kubelet[2614]: E0213 08:35:42.357024 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.357089 kubelet[2614]: W0213 08:35:42.357050 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.357089 kubelet[2614]: E0213 08:35:42.357089 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.357586 kubelet[2614]: E0213 08:35:42.357510 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.357586 kubelet[2614]: W0213 08:35:42.357541 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.357871 kubelet[2614]: E0213 08:35:42.357677 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.358122 kubelet[2614]: E0213 08:35:42.358051 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.358122 kubelet[2614]: W0213 08:35:42.358081 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.358122 kubelet[2614]: E0213 08:35:42.358121 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.358663 kubelet[2614]: E0213 08:35:42.358587 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.358663 kubelet[2614]: W0213 08:35:42.358619 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.358663 kubelet[2614]: E0213 08:35:42.358663 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.359268 kubelet[2614]: E0213 08:35:42.359188 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.359268 kubelet[2614]: W0213 08:35:42.359220 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.359268 kubelet[2614]: E0213 08:35:42.359265 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.359757 kubelet[2614]: E0213 08:35:42.359704 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.359757 kubelet[2614]: W0213 08:35:42.359728 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.360004 kubelet[2614]: E0213 08:35:42.359820 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.360255 kubelet[2614]: E0213 08:35:42.360225 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.360376 kubelet[2614]: W0213 08:35:42.360258 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.360376 kubelet[2614]: E0213 08:35:42.360294 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:42.360752 kubelet[2614]: E0213 08:35:42.360692 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.360752 kubelet[2614]: W0213 08:35:42.360727 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.360988 kubelet[2614]: E0213 08:35:42.360772 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:35:42.361373 kubelet[2614]: E0213 08:35:42.361298 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:35:42.361373 kubelet[2614]: W0213 08:35:42.361331 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:35:42.361373 kubelet[2614]: E0213 08:35:42.361368 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:35:43.264779 kubelet[2614]: E0213 08:35:43.264678 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:45.264372 kubelet[2614]: E0213 08:35:45.264312 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:45.374402 kubelet[2614]: E0213 08:35:45.374304 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:46.397907 systemd[1]: Started sshd@20-145.40.67.89:22-139.178.68.195:38110.service. 
Feb 13 08:35:46.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.67.89:22-139.178.68.195:38110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:46.425266 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:35:46.425327 kernel: audit: type=1130 audit(1707813346.397:1257): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.67.89:22-139.178.68.195:38110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:46.544000 audit[3826]: USER_ACCT pid=3826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.544897 sshd[3826]: Accepted publickey for core from 139.178.68.195 port 38110 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:35:46.546224 sshd[3826]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:35:46.548932 systemd-logind[1463]: New session 13 of user core. Feb 13 08:35:46.550052 systemd[1]: Started session-13.scope. Feb 13 08:35:46.630241 sshd[3826]: pam_unix(sshd:session): session closed for user core Feb 13 08:35:46.631681 systemd[1]: sshd@20-145.40.67.89:22-139.178.68.195:38110.service: Deactivated successfully. Feb 13 08:35:46.632108 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 08:35:46.632508 systemd-logind[1463]: Session 13 logged out. Waiting for processes to exit. Feb 13 08:35:46.632879 systemd-logind[1463]: Removed session 13. 
Feb 13 08:35:46.545000 audit[3826]: CRED_ACQ pid=3826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.727752 kernel: audit: type=1101 audit(1707813346.544:1258): pid=3826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.727792 kernel: audit: type=1103 audit(1707813346.545:1259): pid=3826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.727810 kernel: audit: type=1006 audit(1707813346.545:1260): pid=3826 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Feb 13 08:35:46.786497 kernel: audit: type=1300 audit(1707813346.545:1260): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe92f1ca50 a2=3 a3=0 items=0 ppid=1 pid=3826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:46.545000 audit[3826]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe92f1ca50 a2=3 a3=0 items=0 ppid=1 pid=3826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:46.879263 kernel: audit: type=1327 audit(1707813346.545:1260): proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:46.545000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:35:46.910124 kernel: audit: type=1105 audit(1707813346.551:1261): pid=3826 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.551000 audit[3826]: USER_START pid=3826 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:47.005704 kernel: audit: type=1103 audit(1707813346.552:1262): pid=3828 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.552000 audit[3828]: CRED_ACQ pid=3828 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:47.095054 kernel: audit: type=1106 audit(1707813346.630:1263): pid=3826 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.630000 audit[3826]: USER_END pid=3826 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:35:47.190863 kernel: audit: type=1104 audit(1707813346.630:1264): pid=3826 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:46.630000 audit[3826]: CRED_DISP pid=3826 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:47.264850 kubelet[2614]: E0213 08:35:47.264813 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:46.630000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-145.40.67.89:22-139.178.68.195:38110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:49.265514 kubelet[2614]: E0213 08:35:49.265418 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:50.375639 kubelet[2614]: E0213 08:35:50.375549 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:51.264781 kubelet[2614]: E0213 08:35:51.264678 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:51.641436 systemd[1]: Started sshd@21-145.40.67.89:22-139.178.68.195:38118.service. Feb 13 08:35:51.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.67.89:22-139.178.68.195:38118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:51.669078 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:35:51.669177 kernel: audit: type=1130 audit(1707813351.640:1266): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.67.89:22-139.178.68.195:38118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:51.788000 audit[3851]: USER_ACCT pid=3851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.789183 sshd[3851]: Accepted publickey for core from 139.178.68.195 port 38118 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:35:51.793199 sshd[3851]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:35:51.802853 systemd-logind[1463]: New session 14 of user core. Feb 13 08:35:51.805289 systemd[1]: Started session-14.scope. Feb 13 08:35:51.791000 audit[3851]: CRED_ACQ pid=3851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.893253 sshd[3851]: pam_unix(sshd:session): session closed for user core Feb 13 08:35:51.894718 systemd[1]: sshd@21-145.40.67.89:22-139.178.68.195:38118.service: Deactivated successfully. Feb 13 08:35:51.895178 systemd[1]: session-14.scope: Deactivated successfully. Feb 13 08:35:51.895623 systemd-logind[1463]: Session 14 logged out. Waiting for processes to exit. Feb 13 08:35:51.896264 systemd-logind[1463]: Removed session 14. 
Feb 13 08:35:51.972396 kernel: audit: type=1101 audit(1707813351.788:1267): pid=3851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.972425 kernel: audit: type=1103 audit(1707813351.791:1268): pid=3851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.972443 kernel: audit: type=1006 audit(1707813351.791:1269): pid=3851 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Feb 13 08:35:52.030826 kernel: audit: type=1300 audit(1707813351.791:1269): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd27998ce0 a2=3 a3=0 items=0 ppid=1 pid=3851 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:51.791000 audit[3851]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd27998ce0 a2=3 a3=0 items=0 ppid=1 pid=3851 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:52.122642 kernel: audit: type=1327 audit(1707813351.791:1269): proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:51.791000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:52.153161 kernel: audit: type=1105 audit(1707813351.815:1270): pid=3851 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.815000 audit[3851]: USER_START pid=3851 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:52.247738 kernel: audit: type=1103 audit(1707813351.816:1271): pid=3853 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.816000 audit[3853]: CRED_ACQ pid=3853 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:52.337085 kernel: audit: type=1106 audit(1707813351.893:1272): pid=3851 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.893000 audit[3851]: USER_END pid=3851 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:52.432690 kernel: audit: type=1104 audit(1707813351.893:1273): pid=3851 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.893000 audit[3851]: CRED_DISP pid=3851 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:51.893000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-145.40.67.89:22-139.178.68.195:38118 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:53.264841 kubelet[2614]: E0213 08:35:53.264787 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:55.264906 kubelet[2614]: E0213 08:35:55.264889 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:55.377834 kubelet[2614]: E0213 08:35:55.377763 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:35:56.902832 systemd[1]: Started sshd@22-145.40.67.89:22-139.178.68.195:56196.service. Feb 13 08:35:56.900000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.67.89:22-139.178.68.195:56196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:56.929922 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:35:56.929996 kernel: audit: type=1130 audit(1707813356.900:1275): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.67.89:22-139.178.68.195:56196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:35:57.047000 audit[3877]: USER_ACCT pid=3877 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.048373 sshd[3877]: Accepted publickey for core from 139.178.68.195 port 56196 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:35:57.049877 sshd[3877]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:35:57.052149 systemd-logind[1463]: New session 15 of user core. Feb 13 08:35:57.052687 systemd[1]: Started session-15.scope. Feb 13 08:35:57.130907 sshd[3877]: pam_unix(sshd:session): session closed for user core Feb 13 08:35:57.132342 systemd[1]: sshd@22-145.40.67.89:22-139.178.68.195:56196.service: Deactivated successfully. Feb 13 08:35:57.132753 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 08:35:57.133146 systemd-logind[1463]: Session 15 logged out. Waiting for processes to exit. Feb 13 08:35:57.133648 systemd-logind[1463]: Removed session 15. 
Feb 13 08:35:57.047000 audit[3877]: CRED_ACQ pid=3877 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.230313 kernel: audit: type=1101 audit(1707813357.047:1276): pid=3877 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.230350 kernel: audit: type=1103 audit(1707813357.047:1277): pid=3877 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.230364 kernel: audit: type=1006 audit(1707813357.047:1278): pid=3877 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Feb 13 08:35:57.264219 kubelet[2614]: E0213 08:35:57.264193 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:35:57.289010 kernel: audit: type=1300 audit(1707813357.047:1278): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc5209210 a2=3 a3=0 items=0 ppid=1 pid=3877 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:57.047000 audit[3877]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc5209210 a2=3 a3=0 items=0 ppid=1 pid=3877 
auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:35:57.381013 kernel: audit: type=1327 audit(1707813357.047:1278): proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:57.047000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:35:57.411558 kernel: audit: type=1105 audit(1707813357.052:1279): pid=3877 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.052000 audit[3877]: USER_START pid=3877 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.053000 audit[3879]: CRED_ACQ pid=3879 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.595606 kernel: audit: type=1103 audit(1707813357.053:1280): pid=3879 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.595639 kernel: audit: type=1106 audit(1707813357.129:1281): pid=3877 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.129000 audit[3877]: USER_END pid=3877 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.691135 kernel: audit: type=1104 audit(1707813357.129:1282): pid=3877 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.129000 audit[3877]: CRED_DISP pid=3877 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:35:57.130000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-145.40.67.89:22-139.178.68.195:56196 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:35:59.264834 kubelet[2614]: E0213 08:35:59.264760 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:00.379970 kubelet[2614]: E0213 08:36:00.379860 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:36:01.264215 kubelet[2614]: E0213 08:36:01.264170 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:02.140776 systemd[1]: Started sshd@23-145.40.67.89:22-139.178.68.195:56198.service. Feb 13 08:36:02.138000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.67.89:22-139.178.68.195:56198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:02.167113 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:36:02.167143 kernel: audit: type=1130 audit(1707813362.138:1284): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.67.89:22-139.178.68.195:56198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:02.282000 audit[3903]: USER_ACCT pid=3903 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.283842 sshd[3903]: Accepted publickey for core from 139.178.68.195 port 56198 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:36:02.285249 sshd[3903]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:36:02.287503 systemd-logind[1463]: New session 16 of user core. Feb 13 08:36:02.288229 systemd[1]: Started session-16.scope. Feb 13 08:36:02.365814 sshd[3903]: pam_unix(sshd:session): session closed for user core Feb 13 08:36:02.367265 systemd[1]: sshd@23-145.40.67.89:22-139.178.68.195:56198.service: Deactivated successfully. Feb 13 08:36:02.367685 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 08:36:02.367989 systemd-logind[1463]: Session 16 logged out. Waiting for processes to exit. Feb 13 08:36:02.368414 systemd-logind[1463]: Removed session 16. 
Feb 13 08:36:02.283000 audit[3903]: CRED_ACQ pid=3903 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.465686 kernel: audit: type=1101 audit(1707813362.282:1285): pid=3903 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.465728 kernel: audit: type=1103 audit(1707813362.283:1286): pid=3903 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.465750 kernel: audit: type=1006 audit(1707813362.283:1287): pid=3903 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Feb 13 08:36:02.524333 kernel: audit: type=1300 audit(1707813362.283:1287): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca6254230 a2=3 a3=0 items=0 ppid=1 pid=3903 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:02.283000 audit[3903]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca6254230 a2=3 a3=0 items=0 ppid=1 pid=3903 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:02.616478 kernel: audit: type=1327 audit(1707813362.283:1287): proctitle=737368643A20636F7265205B707269765D Feb 13 08:36:02.283000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:36:02.647014 kernel: audit: type=1105 audit(1707813362.288:1288): pid=3903 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.288000 audit[3903]: USER_START pid=3903 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.741527 kernel: audit: type=1103 audit(1707813362.289:1289): pid=3905 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.289000 audit[3905]: CRED_ACQ pid=3905 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.830761 kernel: audit: type=1106 audit(1707813362.364:1290): pid=3903 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.364000 audit[3903]: USER_END pid=3903 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:36:02.926358 kernel: audit: type=1104 audit(1707813362.364:1291): pid=3903 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.364000 audit[3903]: CRED_DISP pid=3903 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:02.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-145.40.67.89:22-139.178.68.195:56198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:03.265090 kubelet[2614]: E0213 08:36:03.265043 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:03.365992 kubelet[2614]: E0213 08:36:03.365922 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.365992 kubelet[2614]: W0213 08:36:03.365980 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.366341 kubelet[2614]: E0213 08:36:03.366026 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.366516 kubelet[2614]: E0213 08:36:03.366481 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.366516 kubelet[2614]: W0213 08:36:03.366516 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.366755 kubelet[2614]: E0213 08:36:03.366552 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.366996 kubelet[2614]: E0213 08:36:03.366970 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.366996 kubelet[2614]: W0213 08:36:03.366996 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.367199 kubelet[2614]: E0213 08:36:03.367027 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.367461 kubelet[2614]: E0213 08:36:03.367434 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.367565 kubelet[2614]: W0213 08:36:03.367468 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.367565 kubelet[2614]: E0213 08:36:03.367506 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.367867 kubelet[2614]: E0213 08:36:03.367843 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.367867 kubelet[2614]: W0213 08:36:03.367865 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.368099 kubelet[2614]: E0213 08:36:03.367893 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.368321 kubelet[2614]: E0213 08:36:03.368297 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.368446 kubelet[2614]: W0213 08:36:03.368320 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.368446 kubelet[2614]: E0213 08:36:03.368350 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.368824 kubelet[2614]: E0213 08:36:03.368800 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.368947 kubelet[2614]: W0213 08:36:03.368824 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.368947 kubelet[2614]: E0213 08:36:03.368853 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.369317 kubelet[2614]: E0213 08:36:03.369273 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.369317 kubelet[2614]: W0213 08:36:03.369295 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.369317 kubelet[2614]: E0213 08:36:03.369324 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.369722 kubelet[2614]: E0213 08:36:03.369698 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.369722 kubelet[2614]: W0213 08:36:03.369720 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.369962 kubelet[2614]: E0213 08:36:03.369749 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.370234 kubelet[2614]: E0213 08:36:03.370204 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.370342 kubelet[2614]: W0213 08:36:03.370236 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.370342 kubelet[2614]: E0213 08:36:03.370272 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.370757 kubelet[2614]: E0213 08:36:03.370712 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.370757 kubelet[2614]: W0213 08:36:03.370735 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.371005 kubelet[2614]: E0213 08:36:03.370767 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.371297 kubelet[2614]: E0213 08:36:03.371221 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.371297 kubelet[2614]: W0213 08:36:03.371252 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.371297 kubelet[2614]: E0213 08:36:03.371287 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.371784 kubelet[2614]: E0213 08:36:03.371758 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.371784 kubelet[2614]: W0213 08:36:03.371782 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.372015 kubelet[2614]: E0213 08:36:03.371817 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.372336 kubelet[2614]: E0213 08:36:03.372259 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.372336 kubelet[2614]: W0213 08:36:03.372290 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.372336 kubelet[2614]: E0213 08:36:03.372327 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.372731 kubelet[2614]: E0213 08:36:03.372703 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.372731 kubelet[2614]: W0213 08:36:03.372723 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.372963 kubelet[2614]: E0213 08:36:03.372754 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.373265 kubelet[2614]: E0213 08:36:03.373196 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.373265 kubelet[2614]: W0213 08:36:03.373228 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.373265 kubelet[2614]: E0213 08:36:03.373262 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.373692 kubelet[2614]: E0213 08:36:03.373664 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.373692 kubelet[2614]: W0213 08:36:03.373689 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.373896 kubelet[2614]: E0213 08:36:03.373721 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:03.374148 kubelet[2614]: E0213 08:36:03.374096 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.374148 kubelet[2614]: W0213 08:36:03.374118 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.374148 kubelet[2614]: E0213 08:36:03.374148 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:03.374525 kubelet[2614]: E0213 08:36:03.374499 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:03.374525 kubelet[2614]: W0213 08:36:03.374524 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:03.374714 kubelet[2614]: E0213 08:36:03.374552 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Feb 13 08:36:03.374865 kubelet[2614]: E0213 08:36:03.374844    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:03.374986 kubelet[2614]: W0213 08:36:03.374865    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:03.374986 kubelet[2614]: E0213 08:36:03.374891    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:05.265610 kubelet[2614]: E0213 08:36:05.265501    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:05.381269 kubelet[2614]: E0213 08:36:05.381175    2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:07.265436 kubelet[2614]: E0213 08:36:07.265377    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:07.375253 systemd[1]: Started sshd@24-145.40.67.89:22-139.178.68.195:56080.service.
Feb 13 08:36:07.373000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.67.89:22-139.178.68.195:56080 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:07.402464 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:36:07.402542 kernel: audit: type=1130 audit(1707813367.373:1293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.67.89:22-139.178.68.195:56080 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:07.520000 audit[3949]: USER_ACCT pid=3949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.522853 sshd[3949]: Accepted publickey for core from 139.178.68.195 port 56080 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:36:07.527234 sshd[3949]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:36:07.531363 systemd-logind[1463]: New session 17 of user core.
Feb 13 08:36:07.531921 systemd[1]: Started session-17.scope.
Feb 13 08:36:07.609150 sshd[3949]: pam_unix(sshd:session): session closed for user core
Feb 13 08:36:07.610572 systemd[1]: sshd@24-145.40.67.89:22-139.178.68.195:56080.service: Deactivated successfully.
Feb 13 08:36:07.611007 systemd[1]: session-17.scope: Deactivated successfully.
Feb 13 08:36:07.611362 systemd-logind[1463]: Session 17 logged out. Waiting for processes to exit.
Feb 13 08:36:07.611783 systemd-logind[1463]: Removed session 17.
Feb 13 08:36:07.524000 audit[3949]: CRED_ACQ pid=3949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.704478 kernel: audit: type=1101 audit(1707813367.520:1294): pid=3949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.704520 kernel: audit: type=1103 audit(1707813367.524:1295): pid=3949 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.704539 kernel: audit: type=1006 audit(1707813367.524:1296): pid=3949 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1
Feb 13 08:36:07.763070 kernel: audit: type=1300 audit(1707813367.524:1296): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc93f87100 a2=3 a3=0 items=0 ppid=1 pid=3949 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:07.524000 audit[3949]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc93f87100 a2=3 a3=0 items=0 ppid=1 pid=3949 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:07.855068 kernel: audit: type=1327 audit(1707813367.524:1296): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:07.524000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:07.885566 kernel: audit: type=1105 audit(1707813367.531:1297): pid=3949 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.531000 audit[3949]: USER_START pid=3949 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.980042 kernel: audit: type=1103 audit(1707813367.532:1298): pid=3951 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.532000 audit[3951]: CRED_ACQ pid=3951 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:08.069297 kernel: audit: type=1106 audit(1707813367.607:1299): pid=3949 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.607000 audit[3949]: USER_END pid=3949 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.607000 audit[3949]: CRED_DISP pid=3949 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:08.254167 kernel: audit: type=1104 audit(1707813367.607:1300): pid=3949 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:07.608000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-145.40.67.89:22-139.178.68.195:56080 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:09.264156 kubelet[2614]: E0213 08:36:09.264110    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:10.382684 kubelet[2614]: E0213 08:36:10.382581    2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:11.264900 kubelet[2614]: E0213 08:36:11.264853    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:12.618769 systemd[1]: Started sshd@25-145.40.67.89:22-139.178.68.195:56090.service.
Feb 13 08:36:12.617000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.67.89:22-139.178.68.195:56090 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:12.645769 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:36:12.645818 kernel: audit: type=1130 audit(1707813372.617:1302): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.67.89:22-139.178.68.195:56090 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:12.762000 audit[3976]: USER_ACCT pid=3976 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.764004 sshd[3976]: Accepted publickey for core from 139.178.68.195 port 56090 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:36:12.765220 sshd[3976]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:36:12.767575 systemd-logind[1463]: New session 18 of user core.
Feb 13 08:36:12.768123 systemd[1]: Started session-18.scope.
Feb 13 08:36:12.847792 sshd[3976]: pam_unix(sshd:session): session closed for user core
Feb 13 08:36:12.849415 systemd[1]: sshd@25-145.40.67.89:22-139.178.68.195:56090.service: Deactivated successfully.
Feb 13 08:36:12.849918 systemd[1]: session-18.scope: Deactivated successfully.
Feb 13 08:36:12.850370 systemd-logind[1463]: Session 18 logged out. Waiting for processes to exit.
Feb 13 08:36:12.850816 systemd-logind[1463]: Removed session 18.
Feb 13 08:36:12.763000 audit[3976]: CRED_ACQ pid=3976 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.948259 kernel: audit: type=1101 audit(1707813372.762:1303): pid=3976 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.948302 kernel: audit: type=1103 audit(1707813372.763:1304): pid=3976 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.948318 kernel: audit: type=1006 audit(1707813372.763:1305): pid=3976 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1
Feb 13 08:36:13.006858 kernel: audit: type=1300 audit(1707813372.763:1305): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe09a6ad70 a2=3 a3=0 items=0 ppid=1 pid=3976 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:12.763000 audit[3976]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe09a6ad70 a2=3 a3=0 items=0 ppid=1 pid=3976 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:13.098900 kernel: audit: type=1327 audit(1707813372.763:1305): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:12.763000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:13.129443 kernel: audit: type=1105 audit(1707813372.768:1306): pid=3976 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.768000 audit[3976]: USER_START pid=3976 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:13.223944 kernel: audit: type=1103 audit(1707813372.769:1307): pid=3978 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.769000 audit[3978]: CRED_ACQ pid=3978 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:13.264544 kubelet[2614]: E0213 08:36:13.264505    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:13.313190 kernel: audit: type=1106 audit(1707813372.846:1308): pid=3976 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.846000 audit[3976]: USER_END pid=3976 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:13.408720 kernel: audit: type=1104 audit(1707813372.846:1309): pid=3976 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.846000 audit[3976]: CRED_DISP pid=3976 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:12.847000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-145.40.67.89:22-139.178.68.195:56090 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:15.264664 kubelet[2614]: E0213 08:36:15.264559    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:15.384196 kubelet[2614]: E0213 08:36:15.384120    2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:17.265016 kubelet[2614]: E0213 08:36:17.264901    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:17.856875 systemd[1]: Started sshd@26-145.40.67.89:22-139.178.68.195:38428.service.
Feb 13 08:36:17.856000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.67.89:22-139.178.68.195:38428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:17.883995 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:36:17.884044 kernel: audit: type=1130 audit(1707813377.856:1311): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.67.89:22-139.178.68.195:38428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:18.002000 audit[4003]: USER_ACCT pid=4003 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.003166 sshd[4003]: Accepted publickey for core from 139.178.68.195 port 38428 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:36:18.005236 sshd[4003]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:36:18.007594 systemd-logind[1463]: New session 19 of user core.
Feb 13 08:36:18.008024 systemd[1]: Started session-19.scope.
Feb 13 08:36:18.085688 sshd[4003]: pam_unix(sshd:session): session closed for user core
Feb 13 08:36:18.087060 systemd[1]: sshd@26-145.40.67.89:22-139.178.68.195:38428.service: Deactivated successfully.
Feb 13 08:36:18.087469 systemd[1]: session-19.scope: Deactivated successfully.
Feb 13 08:36:18.087802 systemd-logind[1463]: Session 19 logged out. Waiting for processes to exit.
Feb 13 08:36:18.088336 systemd-logind[1463]: Removed session 19.
Feb 13 08:36:18.004000 audit[4003]: CRED_ACQ pid=4003 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.185017 kernel: audit: type=1101 audit(1707813378.002:1312): pid=4003 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.185052 kernel: audit: type=1103 audit(1707813378.004:1313): pid=4003 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.185068 kernel: audit: type=1006 audit(1707813378.004:1314): pid=4003 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1
Feb 13 08:36:18.243655 kernel: audit: type=1300 audit(1707813378.004:1314): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc817383a0 a2=3 a3=0 items=0 ppid=1 pid=4003 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:18.004000 audit[4003]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc817383a0 a2=3 a3=0 items=0 ppid=1 pid=4003 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:18.335654 kernel: audit: type=1327 audit(1707813378.004:1314): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:18.004000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:18.366179 kernel: audit: type=1105 audit(1707813378.009:1315): pid=4003 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.009000 audit[4003]: USER_START pid=4003 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.460681 kernel: audit: type=1103 audit(1707813378.010:1316): pid=4005 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.010000 audit[4005]: CRED_ACQ pid=4005 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.085000 audit[4003]: USER_END pid=4003 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.550997 kernel: audit: type=1106 audit(1707813378.085:1317): pid=4003 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.085000 audit[4003]: CRED_DISP pid=4003 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.718337 systemd[1]: Started sshd@27-145.40.67.89:22-161.35.108.241:45544.service.
Feb 13 08:36:18.734923 kernel: audit: type=1104 audit(1707813378.085:1318): pid=4003 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:18.086000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-145.40.67.89:22-139.178.68.195:38428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:18.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.67.89:22-161.35.108.241:45544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:19.168968 sshd[4029]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241  user=root
Feb 13 08:36:19.168000 audit[4029]: ANOM_LOGIN_FAILURES pid=4029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:19.168000 audit[4029]: USER_AUTH pid=4029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed'
Feb 13 08:36:19.169201 sshd[4029]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked
Feb 13 08:36:19.265115 kubelet[2614]: E0213 08:36:19.265101    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:20.385846 kubelet[2614]: E0213 08:36:20.385786    2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:21.014120 sshd[4029]: Failed password for root from 161.35.108.241 port 45544 ssh2
Feb 13 08:36:21.265052 kubelet[2614]: E0213 08:36:21.264984    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:22.091149 sshd[4029]: Received disconnect from 161.35.108.241 port 45544:11: Bye Bye [preauth]
Feb 13 08:36:22.091149 sshd[4029]: Disconnected from authenticating user root 161.35.108.241 port 45544 [preauth]
Feb 13 08:36:22.093662 systemd[1]: sshd@27-145.40.67.89:22-161.35.108.241:45544.service: Deactivated successfully.
Feb 13 08:36:22.093000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-145.40.67.89:22-161.35.108.241:45544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:23.094462 systemd[1]: Started sshd@28-145.40.67.89:22-139.178.68.195:38442.service.
Feb 13 08:36:23.093000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.67.89:22-139.178.68.195:38442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:23.121536 kernel: kauditd_printk_skb: 5 callbacks suppressed
Feb 13 08:36:23.121603 kernel: audit: type=1130 audit(1707813383.093:1324): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.67.89:22-139.178.68.195:38442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:23.239000 audit[4033]: USER_ACCT pid=4033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.240387 sshd[4033]: Accepted publickey for core from 139.178.68.195 port 38442 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:36:23.242211 sshd[4033]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:36:23.244359 systemd-logind[1463]: New session 20 of user core.
Feb 13 08:36:23.244836 systemd[1]: Started session-20.scope.
Feb 13 08:36:23.264834 kubelet[2614]: E0213 08:36:23.264821    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:23.324923 sshd[4033]: pam_unix(sshd:session): session closed for user core
Feb 13 08:36:23.326299 systemd[1]: sshd@28-145.40.67.89:22-139.178.68.195:38442.service: Deactivated successfully.
Feb 13 08:36:23.326715 systemd[1]: session-20.scope: Deactivated successfully.
Feb 13 08:36:23.327118 systemd-logind[1463]: Session 20 logged out. Waiting for processes to exit.
Feb 13 08:36:23.327628 systemd-logind[1463]: Removed session 20.
Feb 13 08:36:23.241000 audit[4033]: CRED_ACQ pid=4033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.423155 kernel: audit: type=1101 audit(1707813383.239:1325): pid=4033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.423190 kernel: audit: type=1103 audit(1707813383.241:1326): pid=4033 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.423209 kernel: audit: type=1006 audit(1707813383.241:1327): pid=4033 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1
Feb 13 08:36:23.482303 kernel: audit: type=1300 audit(1707813383.241:1327): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd16c8e680 a2=3 a3=0 items=0 ppid=1 pid=4033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:23.241000 audit[4033]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd16c8e680 a2=3 a3=0 items=0 ppid=1 pid=4033 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:23.574448 kernel: audit: type=1327 audit(1707813383.241:1327): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:23.241000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:23.605011 kernel: audit: type=1105 audit(1707813383.246:1328): pid=4033 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.246000 audit[4033]: USER_START pid=4033 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.246000 audit[4035]: CRED_ACQ pid=4035 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.788741 kernel: audit: type=1103 audit(1707813383.246:1329): pid=4035 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.788772 kernel: audit: type=1106 audit(1707813383.324:1330): pid=4033 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.324000 audit[4033]: USER_END pid=4033 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.884283 kernel: audit: type=1104 audit(1707813383.324:1331): pid=4033 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.324000 audit[4033]: CRED_DISP pid=4033 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:23.325000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-145.40.67.89:22-139.178.68.195:38442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:25.265131 kubelet[2614]: E0213 08:36:25.265081    2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:25.345141 kubelet[2614]: E0213 08:36:25.345043    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:25.345141 kubelet[2614]: W0213 08:36:25.345089    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:25.345141 kubelet[2614]: E0213 08:36:25.345136    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:25.345652 kubelet[2614]: E0213 08:36:25.345623    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:25.345767 kubelet[2614]: W0213 08:36:25.345655    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:25.345767 kubelet[2614]: E0213 08:36:25.345694    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:25.346212 kubelet[2614]: E0213 08:36:25.346142    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:25.346212 kubelet[2614]: W0213 08:36:25.346166    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:25.346212 kubelet[2614]: E0213 08:36:25.346200    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:25.346724 kubelet[2614]: E0213 08:36:25.346667    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:25.346724 kubelet[2614]: W0213 08:36:25.346690    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:25.346724 kubelet[2614]: E0213 08:36:25.346723    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:25.347224 kubelet[2614]: E0213 08:36:25.347156    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:25.347224 kubelet[2614]: W0213 08:36:25.347178    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:25.347224 kubelet[2614]: E0213 08:36:25.347211    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:25.347665 kubelet[2614]: E0213 08:36:25.347625    2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:25.347665 kubelet[2614]: W0213 08:36:25.347649    2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:25.347983 kubelet[2614]: E0213 08:36:25.347678    2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Feb 13 08:36:25.348218 kubelet[2614]: E0213 08:36:25.348145 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.348218 kubelet[2614]: W0213 08:36:25.348177 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.348218 kubelet[2614]: E0213 08:36:25.348216 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:25.348812 kubelet[2614]: E0213 08:36:25.348719 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.348812 kubelet[2614]: W0213 08:36:25.348753 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.348812 kubelet[2614]: E0213 08:36:25.348793 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:25.384684 kubelet[2614]: E0213 08:36:25.384571 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.384684 kubelet[2614]: W0213 08:36:25.384615 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.384684 kubelet[2614]: E0213 08:36:25.384660 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:25.385345 kubelet[2614]: E0213 08:36:25.385304 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.385345 kubelet[2614]: W0213 08:36:25.385337 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.385556 kubelet[2614]: E0213 08:36:25.385381 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:25.386019 kubelet[2614]: E0213 08:36:25.385952 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.386019 kubelet[2614]: W0213 08:36:25.385992 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.386019 kubelet[2614]: E0213 08:36:25.386032 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:25.386597 kubelet[2614]: E0213 08:36:25.386560 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.386597 kubelet[2614]: W0213 08:36:25.386595 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.386829 kubelet[2614]: E0213 08:36:25.386634 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:25.387247 kubelet[2614]: E0213 08:36:25.387163 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:36:25.387486 kubelet[2614]: E0213 08:36:25.387259 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.387486 kubelet[2614]: W0213 08:36:25.387294 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.387486 kubelet[2614]: E0213 08:36:25.387333 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:25.388324 kubelet[2614]: E0213 08:36:25.388242 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:25.388324 kubelet[2614]: W0213 08:36:25.388275 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:25.388324 kubelet[2614]: E0213 08:36:25.388314 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:26.361386 systemd[1]: Started sshd@29-145.40.67.89:22-43.153.15.221:51334.service. Feb 13 08:36:26.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.67.89:22-43.153.15.221:51334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:26.480320 sshd[4074]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:36:26.479000 audit[4074]: USER_AUTH pid=4074 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:36:27.264853 kubelet[2614]: E0213 08:36:27.264836 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:28.285182 sshd[4074]: Failed password for root from 43.153.15.221 port 51334 ssh2 Feb 13 08:36:28.334480 systemd[1]: Started sshd@30-145.40.67.89:22-139.178.68.195:52970.service. Feb 13 08:36:28.333000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.67.89:22-139.178.68.195:52970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:28.361412 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:36:28.361445 kernel: audit: type=1130 audit(1707813388.333:1335): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.67.89:22-139.178.68.195:52970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:28.478000 audit[4077]: USER_ACCT pid=4077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.479993 sshd[4077]: Accepted publickey for core from 139.178.68.195 port 52970 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:36:28.482224 sshd[4077]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:36:28.484502 systemd-logind[1463]: New session 21 of user core. Feb 13 08:36:28.485033 systemd[1]: Started session-21.scope. Feb 13 08:36:28.565188 sshd[4077]: pam_unix(sshd:session): session closed for user core Feb 13 08:36:28.566756 systemd[1]: sshd@30-145.40.67.89:22-139.178.68.195:52970.service: Deactivated successfully. Feb 13 08:36:28.567204 systemd[1]: session-21.scope: Deactivated successfully. Feb 13 08:36:28.567647 systemd-logind[1463]: Session 21 logged out. Waiting for processes to exit. Feb 13 08:36:28.568250 systemd-logind[1463]: Removed session 21. 
Feb 13 08:36:28.481000 audit[4077]: CRED_ACQ pid=4077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.664010 kernel: audit: type=1101 audit(1707813388.478:1336): pid=4077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.664039 kernel: audit: type=1103 audit(1707813388.481:1337): pid=4077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.664051 kernel: audit: type=1006 audit(1707813388.481:1338): pid=4077 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Feb 13 08:36:28.722747 kernel: audit: type=1300 audit(1707813388.481:1338): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd466e5050 a2=3 a3=0 items=0 ppid=1 pid=4077 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:28.481000 audit[4077]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd466e5050 a2=3 a3=0 items=0 ppid=1 pid=4077 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:28.814773 kernel: audit: type=1327 audit(1707813388.481:1338): proctitle=737368643A20636F7265205B707269765D Feb 13 08:36:28.481000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:36:28.845273 kernel: audit: type=1105 audit(1707813388.486:1339): pid=4077 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.486000 audit[4077]: USER_START pid=4077 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.487000 audit[4079]: CRED_ACQ pid=4079 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:29.029003 kernel: audit: type=1103 audit(1707813388.487:1340): pid=4079 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:29.029065 kernel: audit: type=1106 audit(1707813388.565:1341): pid=4077 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.565000 audit[4077]: USER_END pid=4077 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:36:29.124569 kernel: audit: type=1104 audit(1707813388.565:1342): pid=4077 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.565000 audit[4077]: CRED_DISP pid=4077 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:28.565000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-145.40.67.89:22-139.178.68.195:52970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:29.264489 kubelet[2614]: E0213 08:36:29.264443 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:29.349967 sshd[4074]: Received disconnect from 43.153.15.221 port 51334:11: Bye Bye [preauth] Feb 13 08:36:29.349967 sshd[4074]: Disconnected from authenticating user root 43.153.15.221 port 51334 [preauth] Feb 13 08:36:29.352533 systemd[1]: sshd@29-145.40.67.89:22-43.153.15.221:51334.service: Deactivated successfully. Feb 13 08:36:29.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-145.40.67.89:22-43.153.15.221:51334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:30.389262 kubelet[2614]: E0213 08:36:30.389077 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:36:31.264911 kubelet[2614]: E0213 08:36:31.264893 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:31.298421 kubelet[2614]: E0213 08:36:31.298368 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.298421 kubelet[2614]: W0213 08:36:31.298388 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.298421 kubelet[2614]: E0213 08:36:31.298409 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:31.298632 kubelet[2614]: E0213 08:36:31.298617 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.298632 kubelet[2614]: W0213 08:36:31.298629 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.298702 kubelet[2614]: E0213 08:36:31.298643 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:31.298872 kubelet[2614]: E0213 08:36:31.298840 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.298872 kubelet[2614]: W0213 08:36:31.298849 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.298872 kubelet[2614]: E0213 08:36:31.298860 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:31.299098 kubelet[2614]: E0213 08:36:31.299059 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.299098 kubelet[2614]: W0213 08:36:31.299070 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.299098 kubelet[2614]: E0213 08:36:31.299083 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:31.299297 kubelet[2614]: E0213 08:36:31.299286 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.299344 kubelet[2614]: W0213 08:36:31.299298 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.299344 kubelet[2614]: E0213 08:36:31.299311 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:31.299520 kubelet[2614]: E0213 08:36:31.299510 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.299557 kubelet[2614]: W0213 08:36:31.299521 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.299557 kubelet[2614]: E0213 08:36:31.299534 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:31.299741 kubelet[2614]: E0213 08:36:31.299714 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.299741 kubelet[2614]: W0213 08:36:31.299722 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.299741 kubelet[2614]: E0213 08:36:31.299733 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:31.299865 kubelet[2614]: E0213 08:36:31.299857 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.299902 kubelet[2614]: W0213 08:36:31.299865 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.299902 kubelet[2614]: E0213 08:36:31.299875 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:31.300068 kubelet[2614]: E0213 08:36:31.300032 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.300068 kubelet[2614]: W0213 08:36:31.300039 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.300068 kubelet[2614]: E0213 08:36:31.300048 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:31.300206 kubelet[2614]: E0213 08:36:31.300162 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.300206 kubelet[2614]: W0213 08:36:31.300169 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.300206 kubelet[2614]: E0213 08:36:31.300178 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:31.300302 kubelet[2614]: E0213 08:36:31.300284 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.300302 kubelet[2614]: W0213 08:36:31.300291 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.300302 kubelet[2614]: E0213 08:36:31.300300 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:31.300411 kubelet[2614]: E0213 08:36:31.300402 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:31.300411 kubelet[2614]: W0213 08:36:31.300409 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:31.300482 kubelet[2614]: E0213 08:36:31.300419 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:33.265516 kubelet[2614]: E0213 08:36:33.265413 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:33.577335 systemd[1]: Started sshd@31-145.40.67.89:22-139.178.68.195:52978.service. 
Feb 13 08:36:33.576000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.67.89:22-139.178.68.195:52978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:33.617640 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:36:33.617738 kernel: audit: type=1130 audit(1707813393.576:1345): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.67.89:22-139.178.68.195:52978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:33.733000 audit[4116]: USER_ACCT pid=4116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.735092 sshd[4116]: Accepted publickey for core from 139.178.68.195 port 52978 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:36:33.737240 sshd[4116]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:36:33.739563 systemd-logind[1463]: New session 22 of user core. Feb 13 08:36:33.740048 systemd[1]: Started session-22.scope. Feb 13 08:36:33.818040 sshd[4116]: pam_unix(sshd:session): session closed for user core Feb 13 08:36:33.819407 systemd[1]: sshd@31-145.40.67.89:22-139.178.68.195:52978.service: Deactivated successfully. Feb 13 08:36:33.819828 systemd[1]: session-22.scope: Deactivated successfully. Feb 13 08:36:33.820264 systemd-logind[1463]: Session 22 logged out. Waiting for processes to exit. Feb 13 08:36:33.820795 systemd-logind[1463]: Removed session 22. 
Feb 13 08:36:33.736000 audit[4116]: CRED_ACQ pid=4116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.917047 kernel: audit: type=1101 audit(1707813393.733:1346): pid=4116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.917113 kernel: audit: type=1103 audit(1707813393.736:1347): pid=4116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.917128 kernel: audit: type=1006 audit(1707813393.736:1348): pid=4116 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Feb 13 08:36:33.975671 kernel: audit: type=1300 audit(1707813393.736:1348): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff4be5470 a2=3 a3=0 items=0 ppid=1 pid=4116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:33.736000 audit[4116]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffff4be5470 a2=3 a3=0 items=0 ppid=1 pid=4116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:34.067705 kernel: audit: type=1327 audit(1707813393.736:1348): proctitle=737368643A20636F7265205B707269765D Feb 13 08:36:33.736000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:36:34.098202 kernel: audit: type=1105 audit(1707813393.741:1349): pid=4116 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.741000 audit[4116]: USER_START pid=4116 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.742000 audit[4118]: CRED_ACQ pid=4118 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:34.281941 kernel: audit: type=1103 audit(1707813393.742:1350): pid=4118 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:34.282025 kernel: audit: type=1106 audit(1707813393.817:1351): pid=4116 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.817000 audit[4116]: USER_END pid=4116 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:36:33.817000 audit[4116]: CRED_DISP pid=4116 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:34.466849 kernel: audit: type=1104 audit(1707813393.817:1352): pid=4116 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:33.818000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-145.40.67.89:22-139.178.68.195:52978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:35.264548 kubelet[2614]: E0213 08:36:35.264497 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:35.390296 kubelet[2614]: E0213 08:36:35.390226 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:36:35.608000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.608000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002e38020 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:36:35.608000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:36:35.608000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.608000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001103d70 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:36:35.608000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:36:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c011519860 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:36:35.890000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:36:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0140e5c40 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:36:35.890000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:36:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=64 a1=c0115199b0 a2=fc6 a3=0 
items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:36:35.890000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:36:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00712d890 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:36:35.890000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:36:35.890000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.890000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c009b857d0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:36:35.890000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:36:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00e67f140 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:36:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:36:37.264319 kubelet[2614]: E0213 08:36:37.264271 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:38.827294 systemd[1]: Started sshd@32-145.40.67.89:22-139.178.68.195:51156.service. Feb 13 08:36:38.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.67.89:22-139.178.68.195:51156 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:38.854508 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:36:38.854564 kernel: audit: type=1130 audit(1707813398.826:1362): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.67.89:22-139.178.68.195:51156 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:38.972000 audit[4141]: USER_ACCT pid=4141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:38.973117 sshd[4141]: Accepted publickey for core from 139.178.68.195 port 51156 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:36:38.974206 sshd[4141]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:36:38.976575 systemd-logind[1463]: New session 23 of user core. Feb 13 08:36:38.977060 systemd[1]: Started session-23.scope. Feb 13 08:36:39.057147 sshd[4141]: pam_unix(sshd:session): session closed for user core Feb 13 08:36:39.058642 systemd[1]: sshd@32-145.40.67.89:22-139.178.68.195:51156.service: Deactivated successfully. Feb 13 08:36:39.059113 systemd[1]: session-23.scope: Deactivated successfully. Feb 13 08:36:39.059525 systemd-logind[1463]: Session 23 logged out. Waiting for processes to exit. Feb 13 08:36:39.059923 systemd-logind[1463]: Removed session 23. 
Feb 13 08:36:38.973000 audit[4141]: CRED_ACQ pid=4141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.157140 kernel: audit: type=1101 audit(1707813398.972:1363): pid=4141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.157177 kernel: audit: type=1103 audit(1707813398.973:1364): pid=4141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.157203 kernel: audit: type=1006 audit(1707813398.973:1365): pid=4141 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Feb 13 08:36:39.215742 kernel: audit: type=1300 audit(1707813398.973:1365): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd44dcae90 a2=3 a3=0 items=0 ppid=1 pid=4141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:38.973000 audit[4141]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd44dcae90 a2=3 a3=0 items=0 ppid=1 pid=4141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:39.264121 kubelet[2614]: E0213 08:36:39.264108 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:39.307773 kernel: audit: type=1327 audit(1707813398.973:1365): proctitle=737368643A20636F7265205B707269765D Feb 13 08:36:38.973000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:36:39.338316 kernel: audit: type=1105 audit(1707813398.978:1366): pid=4141 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:38.978000 audit[4141]: USER_START pid=4141 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.432825 kernel: audit: type=1103 audit(1707813398.978:1367): pid=4143 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:38.978000 audit[4143]: CRED_ACQ pid=4143 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.056000 audit[4141]: USER_END pid=4141 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 
terminal=ssh res=success' Feb 13 08:36:39.617732 kernel: audit: type=1106 audit(1707813399.056:1368): pid=4141 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.617772 kernel: audit: type=1104 audit(1707813399.056:1369): pid=4141 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.056000 audit[4141]: CRED_DISP pid=4141 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:39.057000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-145.40.67.89:22-139.178.68.195:51156 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:40.392013 kubelet[2614]: E0213 08:36:40.391907 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:36:40.947000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:40.947000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002399d60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:36:40.947000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:36:40.947000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:40.947000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0018cd080 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:36:40.947000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:36:40.949000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:40.949000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0018cd0a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:36:40.949000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:36:40.950000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:36:40.950000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000bfcd40 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:36:40.950000 audit: PROCTITLE 
proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:36:41.264318 kubelet[2614]: E0213 08:36:41.264264 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:42.278332 kubelet[2614]: E0213 08:36:42.278234 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:42.278332 kubelet[2614]: W0213 08:36:42.278277 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:42.278332 kubelet[2614]: E0213 08:36:42.278324 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:42.279401 kubelet[2614]: E0213 08:36:42.278874 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:42.279401 kubelet[2614]: W0213 08:36:42.278908 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:42.279401 kubelet[2614]: E0213 08:36:42.278964 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:36:42.279734 kubelet[2614]: E0213 08:36:42.279513 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:42.279734 kubelet[2614]: W0213 08:36:42.279545 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:42.279734 kubelet[2614]: E0213 08:36:42.279588 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:42.280264 kubelet[2614]: E0213 08:36:42.280173 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:36:42.280264 kubelet[2614]: W0213 08:36:42.280207 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:36:42.280264 kubelet[2614]: E0213 08:36:42.280251 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:36:43.265251 kubelet[2614]: E0213 08:36:43.265205 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:44.066870 systemd[1]: Started sshd@33-145.40.67.89:22-139.178.68.195:51160.service. 
Feb 13 08:36:44.066000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.67.89:22-139.178.68.195:51160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:44.093987 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:36:44.094053 kernel: audit: type=1130 audit(1707813404.066:1375): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.67.89:22-139.178.68.195:51160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:44.210000 audit[4174]: USER_ACCT pid=4174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.211603 sshd[4174]: Accepted publickey for core from 139.178.68.195 port 51160 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:36:44.212285 sshd[4174]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:36:44.214665 systemd-logind[1463]: New session 24 of user core. Feb 13 08:36:44.215172 systemd[1]: Started session-24.scope. Feb 13 08:36:44.295549 sshd[4174]: pam_unix(sshd:session): session closed for user core Feb 13 08:36:44.297075 systemd[1]: sshd@33-145.40.67.89:22-139.178.68.195:51160.service: Deactivated successfully. Feb 13 08:36:44.297672 systemd[1]: session-24.scope: Deactivated successfully. Feb 13 08:36:44.298089 systemd-logind[1463]: Session 24 logged out. Waiting for processes to exit. Feb 13 08:36:44.298699 systemd-logind[1463]: Removed session 24. 
Feb 13 08:36:44.211000 audit[4174]: CRED_ACQ pid=4174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.396856 kernel: audit: type=1101 audit(1707813404.210:1376): pid=4174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.396888 kernel: audit: type=1103 audit(1707813404.211:1377): pid=4174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.396906 kernel: audit: type=1006 audit(1707813404.211:1378): pid=4174 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Feb 13 08:36:44.455523 kernel: audit: type=1300 audit(1707813404.211:1378): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd239ccb80 a2=3 a3=0 items=0 ppid=1 pid=4174 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:44.211000 audit[4174]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd239ccb80 a2=3 a3=0 items=0 ppid=1 pid=4174 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:36:44.547637 kernel: audit: type=1327 audit(1707813404.211:1378): proctitle=737368643A20636F7265205B707269765D Feb 13 08:36:44.211000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:36:44.578171 kernel: audit: type=1105 audit(1707813404.216:1379): pid=4174 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.216000 audit[4174]: USER_START pid=4174 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.672837 kernel: audit: type=1103 audit(1707813404.216:1380): pid=4176 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.216000 audit[4176]: CRED_ACQ pid=4176 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.762176 kernel: audit: type=1106 audit(1707813404.295:1381): pid=4174 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:44.295000 audit[4174]: USER_END pid=4174 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:36:44.857812 kernel: audit: type=1104 audit(1707813404.295:1382): pid=4174 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:44.295000 audit[4174]: CRED_DISP pid=4174 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:44.296000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-145.40.67.89:22-139.178.68.195:51160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:45.265031 kubelet[2614]: E0213 08:36:45.264984 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:45.393679 kubelet[2614]: E0213 08:36:45.393578 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:47.265072 kubelet[2614]: E0213 08:36:47.265053 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:49.265102 kubelet[2614]: E0213 08:36:49.265082 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:49.305287 systemd[1]: Started sshd@34-145.40.67.89:22-139.178.68.195:47250.service.
Feb 13 08:36:49.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.67.89:22-139.178.68.195:47250 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:49.332108 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:36:49.332195 kernel: audit: type=1130 audit(1707813409.304:1384): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.67.89:22-139.178.68.195:47250 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:49.449000 audit[4201]: USER_ACCT pid=4201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.450394 sshd[4201]: Accepted publickey for core from 139.178.68.195 port 47250 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:36:49.452238 sshd[4201]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:36:49.454614 systemd-logind[1463]: New session 25 of user core.
Feb 13 08:36:49.455130 systemd[1]: Started session-25.scope.
Feb 13 08:36:49.535740 sshd[4201]: pam_unix(sshd:session): session closed for user core
Feb 13 08:36:49.537297 systemd[1]: sshd@34-145.40.67.89:22-139.178.68.195:47250.service: Deactivated successfully.
Feb 13 08:36:49.537841 systemd[1]: session-25.scope: Deactivated successfully.
Feb 13 08:36:49.538335 systemd-logind[1463]: Session 25 logged out. Waiting for processes to exit.
Feb 13 08:36:49.538818 systemd-logind[1463]: Removed session 25.
Feb 13 08:36:49.451000 audit[4201]: CRED_ACQ pid=4201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.635267 kernel: audit: type=1101 audit(1707813409.449:1385): pid=4201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.635298 kernel: audit: type=1103 audit(1707813409.451:1386): pid=4201 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.635316 kernel: audit: type=1006 audit(1707813409.451:1387): pid=4201 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1
Feb 13 08:36:49.694012 kernel: audit: type=1300 audit(1707813409.451:1387): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd72235550 a2=3 a3=0 items=0 ppid=1 pid=4201 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:49.451000 audit[4201]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd72235550 a2=3 a3=0 items=0 ppid=1 pid=4201 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:49.786063 kernel: audit: type=1327 audit(1707813409.451:1387): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:49.451000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:49.816632 kernel: audit: type=1105 audit(1707813409.456:1388): pid=4201 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.456000 audit[4201]: USER_START pid=4201 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.911201 kernel: audit: type=1103 audit(1707813409.456:1389): pid=4203 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.456000 audit[4203]: CRED_ACQ pid=4203 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:50.000485 kernel: audit: type=1106 audit(1707813409.535:1390): pid=4201 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.535000 audit[4201]: USER_END pid=4201 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:50.096077 kernel: audit: type=1104 audit(1707813409.535:1391): pid=4201 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.535000 audit[4201]: CRED_DISP pid=4201 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:49.536000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-145.40.67.89:22-139.178.68.195:47250 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:50.396030 kubelet[2614]: E0213 08:36:50.395790 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:51.265148 kubelet[2614]: E0213 08:36:51.265102 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:53.264936 kubelet[2614]: E0213 08:36:53.264889 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:54.537600 systemd[1]: Started sshd@35-145.40.67.89:22-61.83.148.111:53346.service.
Feb 13 08:36:54.536000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.67.89:22-61.83.148.111:53346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:54.539802 systemd[1]: Started sshd@36-145.40.67.89:22-139.178.68.195:47254.service.
Feb 13 08:36:54.564653 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:36:54.564709 kernel: audit: type=1130 audit(1707813414.536:1393): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.67.89:22-61.83.148.111:53346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:54.538000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.67.89:22-139.178.68.195:47254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:54.681205 sshd[4230]: Accepted publickey for core from 139.178.68.195 port 47254 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:36:54.683254 sshd[4230]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:36:54.685560 systemd-logind[1463]: New session 26 of user core.
Feb 13 08:36:54.686180 systemd[1]: Started session-26.scope.
Feb 13 08:36:54.740622 kernel: audit: type=1130 audit(1707813414.538:1394): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.67.89:22-139.178.68.195:47254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:54.740692 kernel: audit: type=1101 audit(1707813414.680:1395): pid=4230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.680000 audit[4230]: USER_ACCT pid=4230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.764237 sshd[4230]: pam_unix(sshd:session): session closed for user core
Feb 13 08:36:54.765639 systemd[1]: sshd@36-145.40.67.89:22-139.178.68.195:47254.service: Deactivated successfully.
Feb 13 08:36:54.766058 systemd[1]: session-26.scope: Deactivated successfully.
Feb 13 08:36:54.766465 systemd-logind[1463]: Session 26 logged out. Waiting for processes to exit.
Feb 13 08:36:54.766884 systemd-logind[1463]: Removed session 26.
Feb 13 08:36:54.832678 kernel: audit: type=1103 audit(1707813414.682:1396): pid=4230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.682000 audit[4230]: CRED_ACQ pid=4230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.923233 kernel: audit: type=1006 audit(1707813414.682:1397): pid=4230 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1
Feb 13 08:36:54.981860 kernel: audit: type=1300 audit(1707813414.682:1397): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2f0ce190 a2=3 a3=0 items=0 ppid=1 pid=4230 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:54.682000 audit[4230]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2f0ce190 a2=3 a3=0 items=0 ppid=1 pid=4230 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:55.073919 kernel: audit: type=1327 audit(1707813414.682:1397): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:54.682000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:36:55.104460 kernel: audit: type=1105 audit(1707813414.687:1398): pid=4230 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.687000 audit[4230]: USER_START pid=4230 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:55.199841 kernel: audit: type=1103 audit(1707813414.687:1399): pid=4233 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.687000 audit[4233]: CRED_ACQ pid=4233 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:55.264971 kubelet[2614]: E0213 08:36:55.264959 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:36:55.289150 kernel: audit: type=1106 audit(1707813414.763:1400): pid=4230 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.763000 audit[4230]: USER_END pid=4230 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:55.334967 sshd[4228]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=61.83.148.111 user=root
Feb 13 08:36:55.335013 sshd[4228]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked
Feb 13 08:36:54.764000 audit[4230]: CRED_DISP pid=4230 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:54.764000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-145.40.67.89:22-139.178.68.195:47254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:55.334000 audit[4228]: ANOM_LOGIN_FAILURES pid=4228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success'
Feb 13 08:36:55.334000 audit[4228]: USER_AUTH pid=4228 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=61.83.148.111 addr=61.83.148.111 terminal=ssh res=failed'
Feb 13 08:36:55.396501 kubelet[2614]: E0213 08:36:55.396491 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:36:56.285789 kubelet[2614]: E0213 08:36:56.285737 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.285789 kubelet[2614]: W0213 08:36:56.285751 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.285789 kubelet[2614]: E0213 08:36:56.285767 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286128 kubelet[2614]: E0213 08:36:56.285907 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286128 kubelet[2614]: W0213 08:36:56.285914 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.286128 kubelet[2614]: E0213 08:36:56.285922 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286128 kubelet[2614]: E0213 08:36:56.286084 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286128 kubelet[2614]: W0213 08:36:56.286093 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.286128 kubelet[2614]: E0213 08:36:56.286105 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286345 kubelet[2614]: E0213 08:36:56.286303 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286345 kubelet[2614]: W0213 08:36:56.286311 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.286345 kubelet[2614]: E0213 08:36:56.286322 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286570 kubelet[2614]: E0213 08:36:56.286528 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286570 kubelet[2614]: W0213 08:36:56.286538 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.286570 kubelet[2614]: E0213 08:36:56.286549 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286703 kubelet[2614]: E0213 08:36:56.286691 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286703 kubelet[2614]: W0213 08:36:56.286698 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.286764 kubelet[2614]: E0213 08:36:56.286706 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286842 kubelet[2614]: E0213 08:36:56.286834 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286873 kubelet[2614]: W0213 08:36:56.286845 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.286873 kubelet[2614]: E0213 08:36:56.286856 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.286956 kubelet[2614]: E0213 08:36:56.286949 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.286956 kubelet[2614]: W0213 08:36:56.286955 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287013 kubelet[2614]: E0213 08:36:56.286964 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287102 kubelet[2614]: E0213 08:36:56.287096 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287138 kubelet[2614]: W0213 08:36:56.287101 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287138 kubelet[2614]: E0213 08:36:56.287109 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287263 kubelet[2614]: E0213 08:36:56.287256 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287263 kubelet[2614]: W0213 08:36:56.287262 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287320 kubelet[2614]: E0213 08:36:56.287270 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287357 kubelet[2614]: E0213 08:36:56.287351 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287389 kubelet[2614]: W0213 08:36:56.287358 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287389 kubelet[2614]: E0213 08:36:56.287367 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287454 kubelet[2614]: E0213 08:36:56.287448 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287454 kubelet[2614]: W0213 08:36:56.287454 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287511 kubelet[2614]: E0213 08:36:56.287461 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287556 kubelet[2614]: E0213 08:36:56.287550 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287585 kubelet[2614]: W0213 08:36:56.287556 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287585 kubelet[2614]: E0213 08:36:56.287564 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287652 kubelet[2614]: E0213 08:36:56.287646 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287652 kubelet[2614]: W0213 08:36:56.287652 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287706 kubelet[2614]: E0213 08:36:56.287659 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287748 kubelet[2614]: E0213 08:36:56.287742 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287748 kubelet[2614]: W0213 08:36:56.287748 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287805 kubelet[2614]: E0213 08:36:56.287755 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.287845 kubelet[2614]: E0213 08:36:56.287839 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.287845 kubelet[2614]: W0213 08:36:56.287845 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.287899 kubelet[2614]: E0213 08:36:56.287852 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.367660 kubelet[2614]: E0213 08:36:56.367589 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.367660 kubelet[2614]: W0213 08:36:56.367639 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.368244 kubelet[2614]: E0213 08:36:56.367705 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.368451 kubelet[2614]: E0213 08:36:56.368376 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.368451 kubelet[2614]: W0213 08:36:56.368412 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.368763 kubelet[2614]: E0213 08:36:56.368473 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.369131 kubelet[2614]: E0213 08:36:56.369054 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.369131 kubelet[2614]: W0213 08:36:56.369089 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.369131 kubelet[2614]: E0213 08:36:56.369137 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.369756 kubelet[2614]: E0213 08:36:56.369701 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.369756 kubelet[2614]: W0213 08:36:56.369749 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.370153 kubelet[2614]: E0213 08:36:56.369816 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.370432 kubelet[2614]: E0213 08:36:56.370393 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.370432 kubelet[2614]: W0213 08:36:56.370426 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.370798 kubelet[2614]: E0213 08:36:56.370490 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.371055 kubelet[2614]: E0213 08:36:56.371016 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.371055 kubelet[2614]: W0213 08:36:56.371047 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.371404 kubelet[2614]: E0213 08:36:56.371179 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.371598 kubelet[2614]: E0213 08:36:56.371562 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.371792 kubelet[2614]: W0213 08:36:56.371601 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.371792 kubelet[2614]: E0213 08:36:56.371660 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.372178 kubelet[2614]: E0213 08:36:56.372138 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.372178 kubelet[2614]: W0213 08:36:56.372171 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.372545 kubelet[2614]: E0213 08:36:56.372224 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.372828 kubelet[2614]: E0213 08:36:56.372793 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.372828 kubelet[2614]: W0213 08:36:56.372824 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.373205 kubelet[2614]: E0213 08:36:56.372974 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.373373 kubelet[2614]: E0213 08:36:56.373354 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.373559 kubelet[2614]: W0213 08:36:56.373386 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.373559 kubelet[2614]: E0213 08:36:56.373437 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.373973 kubelet[2614]: E0213 08:36:56.373911 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.373973 kubelet[2614]: W0213 08:36:56.373971 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.374330 kubelet[2614]: E0213 08:36:56.374023 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:36:56.374535 kubelet[2614]: E0213 08:36:56.374529 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:36:56.374568 kubelet[2614]: W0213 08:36:56.374535 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:36:56.374568 kubelet[2614]: E0213 08:36:56.374544 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Feb 13 08:36:57.264853 kubelet[2614]: E0213 08:36:57.264833 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:57.456778 sshd[4228]: Failed password for root from 61.83.148.111 port 53346 ssh2 Feb 13 08:36:58.327316 sshd[4228]: Received disconnect from 61.83.148.111 port 53346:11: Bye Bye [preauth] Feb 13 08:36:58.327316 sshd[4228]: Disconnected from authenticating user root 61.83.148.111 port 53346 [preauth] Feb 13 08:36:58.329868 systemd[1]: sshd@35-145.40.67.89:22-61.83.148.111:53346.service: Deactivated successfully. Feb 13 08:36:58.329000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-145.40.67.89:22-61.83.148.111:53346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:59.264306 kubelet[2614]: E0213 08:36:59.264261 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:36:59.773239 systemd[1]: Started sshd@37-145.40.67.89:22-139.178.68.195:39084.service. Feb 13 08:36:59.772000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.67.89:22-139.178.68.195:39084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:36:59.800160 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:36:59.800222 kernel: audit: type=1130 audit(1707813419.772:1406): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.67.89:22-139.178.68.195:39084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:36:59.916000 audit[4286]: USER_ACCT pid=4286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:36:59.917831 sshd[4286]: Accepted publickey for core from 139.178.68.195 port 39084 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:36:59.919219 sshd[4286]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:36:59.921628 systemd-logind[1463]: New session 27 of user core. Feb 13 08:36:59.922142 systemd[1]: Started session-27.scope. Feb 13 08:37:00.001639 sshd[4286]: pam_unix(sshd:session): session closed for user core Feb 13 08:37:00.003005 systemd[1]: sshd@37-145.40.67.89:22-139.178.68.195:39084.service: Deactivated successfully. Feb 13 08:37:00.003418 systemd[1]: session-27.scope: Deactivated successfully. Feb 13 08:37:00.003773 systemd-logind[1463]: Session 27 logged out. Waiting for processes to exit. Feb 13 08:37:00.004350 systemd-logind[1463]: Removed session 27. 
Feb 13 08:36:59.918000 audit[4286]: CRED_ACQ pid=4286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.102820 kernel: audit: type=1101 audit(1707813419.916:1407): pid=4286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.102857 kernel: audit: type=1103 audit(1707813419.918:1408): pid=4286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.102874 kernel: audit: type=1006 audit(1707813419.918:1409): pid=4286 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1
Feb 13 08:37:00.161750 kernel: audit: type=1300 audit(1707813419.918:1409): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0b50cb30 a2=3 a3=0 items=0 ppid=1 pid=4286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:59.918000 audit[4286]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe0b50cb30 a2=3 a3=0 items=0 ppid=1 pid=4286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:36:59.918000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:00.285007 kernel: audit: type=1327 audit(1707813419.918:1409): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:00.285031 kernel: audit: type=1105 audit(1707813419.923:1410): pid=4286 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:59.923000 audit[4286]: USER_START pid=4286 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.380031 kernel: audit: type=1103 audit(1707813419.923:1411): pid=4288 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:36:59.923000 audit[4288]: CRED_ACQ pid=4288 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.397494 kubelet[2614]: E0213 08:37:00.397445 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:00.469748 kernel: audit: type=1106 audit(1707813420.001:1412): pid=4286 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.001000 audit[4286]: USER_END pid=4286 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.565845 kernel: audit: type=1104 audit(1707813420.001:1413): pid=4286 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.001000 audit[4286]: CRED_DISP pid=4286 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:00.002000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-145.40.67.89:22-139.178.68.195:39084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:01.265328 kubelet[2614]: E0213 08:37:01.265256 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:03.264775 kubelet[2614]: E0213 08:37:03.264693 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:05.013743 systemd[1]: Started sshd@38-145.40.67.89:22-139.178.68.195:39088.service.
Feb 13 08:37:05.013000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.67.89:22-139.178.68.195:39088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:05.054999 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:37:05.055086 kernel: audit: type=1130 audit(1707813425.013:1415): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.67.89:22-139.178.68.195:39088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:05.086651 sshd[4314]: Accepted publickey for core from 139.178.68.195 port 39088 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:37:05.089287 sshd[4314]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:37:05.092398 systemd-logind[1463]: New session 28 of user core.
Feb 13 08:37:05.092897 systemd[1]: Started session-28.scope.
Feb 13 08:37:05.085000 audit[4314]: USER_ACCT pid=4314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.169617 sshd[4314]: pam_unix(sshd:session): session closed for user core
Feb 13 08:37:05.171027 systemd[1]: sshd@38-145.40.67.89:22-139.178.68.195:39088.service: Deactivated successfully.
Feb 13 08:37:05.171453 systemd[1]: session-28.scope: Deactivated successfully.
Feb 13 08:37:05.171782 systemd-logind[1463]: Session 28 logged out. Waiting for processes to exit.
Feb 13 08:37:05.172201 systemd-logind[1463]: Removed session 28.
Feb 13 08:37:05.235635 kernel: audit: type=1101 audit(1707813425.085:1416): pid=4314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.235671 kernel: audit: type=1103 audit(1707813425.088:1417): pid=4314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.088000 audit[4314]: CRED_ACQ pid=4314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.265109 kubelet[2614]: E0213 08:37:05.265042 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:05.384984 kernel: audit: type=1006 audit(1707813425.088:1418): pid=4314 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1
Feb 13 08:37:05.385036 kernel: audit: type=1300 audit(1707813425.088:1418): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdaed92650 a2=3 a3=0 items=0 ppid=1 pid=4314 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:05.088000 audit[4314]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdaed92650 a2=3 a3=0 items=0 ppid=1 pid=4314 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:05.398721 kubelet[2614]: E0213 08:37:05.398702 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:05.477616 kernel: audit: type=1327 audit(1707813425.088:1418): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:05.088000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:05.508326 kernel: audit: type=1105 audit(1707813425.093:1419): pid=4314 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.093000 audit[4314]: USER_START pid=4314 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.603349 kernel: audit: type=1103 audit(1707813425.095:1420): pid=4316 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.095000 audit[4316]: CRED_ACQ pid=4316 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.169000 audit[4314]: USER_END pid=4314 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.788205 kernel: audit: type=1106 audit(1707813425.169:1421): pid=4314 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.788261 kernel: audit: type=1104 audit(1707813425.169:1422): pid=4314 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.169000 audit[4314]: CRED_DISP pid=4314 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:05.170000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-145.40.67.89:22-139.178.68.195:39088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:07.264458 kubelet[2614]: E0213 08:37:07.264383 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:09.264258 kubelet[2614]: E0213 08:37:09.264213 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:10.179112 systemd[1]: Started sshd@39-145.40.67.89:22-139.178.68.195:36126.service.
Feb 13 08:37:10.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.67.89:22-139.178.68.195:36126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:10.206168 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:37:10.206241 kernel: audit: type=1130 audit(1707813430.178:1424): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.67.89:22-139.178.68.195:36126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:10.324000 audit[4340]: USER_ACCT pid=4340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.325417 sshd[4340]: Accepted publickey for core from 139.178.68.195 port 36126 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:37:10.327254 sshd[4340]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:37:10.329619 systemd-logind[1463]: New session 29 of user core.
Feb 13 08:37:10.330167 systemd[1]: Started session-29.scope.
Feb 13 08:37:10.400125 kubelet[2614]: E0213 08:37:10.400082 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:10.408025 sshd[4340]: pam_unix(sshd:session): session closed for user core
Feb 13 08:37:10.409365 systemd[1]: sshd@39-145.40.67.89:22-139.178.68.195:36126.service: Deactivated successfully.
Feb 13 08:37:10.409804 systemd[1]: session-29.scope: Deactivated successfully.
Feb 13 08:37:10.410223 systemd-logind[1463]: Session 29 logged out. Waiting for processes to exit.
Feb 13 08:37:10.410733 systemd-logind[1463]: Removed session 29.
Feb 13 08:37:10.326000 audit[4340]: CRED_ACQ pid=4340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.507394 kernel: audit: type=1101 audit(1707813430.324:1425): pid=4340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.507432 kernel: audit: type=1103 audit(1707813430.326:1426): pid=4340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.507449 kernel: audit: type=1006 audit(1707813430.326:1427): pid=4340 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1
Feb 13 08:37:10.565953 kernel: audit: type=1300 audit(1707813430.326:1427): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeb06d36d0 a2=3 a3=0 items=0 ppid=1 pid=4340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:10.326000 audit[4340]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeb06d36d0 a2=3 a3=0 items=0 ppid=1 pid=4340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:10.658004 kernel: audit: type=1327 audit(1707813430.326:1427): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:10.326000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:10.688483 kernel: audit: type=1105 audit(1707813430.331:1428): pid=4340 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.331000 audit[4340]: USER_START pid=4340 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.783013 kernel: audit: type=1103 audit(1707813430.332:1429): pid=4342 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.332000 audit[4342]: CRED_ACQ pid=4342 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.872353 kernel: audit: type=1106 audit(1707813430.407:1430): pid=4340 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.407000 audit[4340]: USER_END pid=4340 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.968008 kernel: audit: type=1104 audit(1707813430.407:1431): pid=4340 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.407000 audit[4340]: CRED_DISP pid=4340 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:10.408000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-145.40.67.89:22-139.178.68.195:36126 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:11.265093 kubelet[2614]: E0213 08:37:11.265048 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:13.265514 kubelet[2614]: E0213 08:37:13.265400 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:15.264235 kubelet[2614]: E0213 08:37:15.264189 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:15.401554 kubelet[2614]: E0213 08:37:15.401456 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:15.417172 systemd[1]: Started sshd@40-145.40.67.89:22-139.178.68.195:36128.service.
Feb 13 08:37:15.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.67.89:22-139.178.68.195:36128 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:15.444266 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:37:15.444330 kernel: audit: type=1130 audit(1707813435.416:1433): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.67.89:22-139.178.68.195:36128 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:15.561000 audit[4366]: USER_ACCT pid=4366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.562409 sshd[4366]: Accepted publickey for core from 139.178.68.195 port 36128 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:37:15.563200 sshd[4366]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:37:15.565469 systemd-logind[1463]: New session 30 of user core.
Feb 13 08:37:15.565948 systemd[1]: Started session-30.scope.
Feb 13 08:37:15.645773 sshd[4366]: pam_unix(sshd:session): session closed for user core
Feb 13 08:37:15.647247 systemd[1]: sshd@40-145.40.67.89:22-139.178.68.195:36128.service: Deactivated successfully.
Feb 13 08:37:15.647667 systemd[1]: session-30.scope: Deactivated successfully.
Feb 13 08:37:15.648007 systemd-logind[1463]: Session 30 logged out. Waiting for processes to exit.
Feb 13 08:37:15.648541 systemd-logind[1463]: Removed session 30.
Feb 13 08:37:15.562000 audit[4366]: CRED_ACQ pid=4366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.744635 kernel: audit: type=1101 audit(1707813435.561:1434): pid=4366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.744670 kernel: audit: type=1103 audit(1707813435.562:1435): pid=4366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.744689 kernel: audit: type=1006 audit(1707813435.562:1436): pid=4366 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1
Feb 13 08:37:15.803280 kernel: audit: type=1300 audit(1707813435.562:1436): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6ce8cd80 a2=3 a3=0 items=0 ppid=1 pid=4366 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:15.562000 audit[4366]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6ce8cd80 a2=3 a3=0 items=0 ppid=1 pid=4366 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:15.815915 systemd[1]: Started sshd@41-145.40.67.89:22-161.35.108.241:40058.service.
Feb 13 08:37:15.895417 kernel: audit: type=1327 audit(1707813435.562:1436): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:15.562000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:15.925938 kernel: audit: type=1105 audit(1707813435.567:1437): pid=4366 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.567000 audit[4366]: USER_START pid=4366 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:16.020483 kernel: audit: type=1103 audit(1707813435.568:1438): pid=4368 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.568000 audit[4368]: CRED_ACQ pid=4368 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:16.109821 kernel: audit: type=1106 audit(1707813435.645:1439): pid=4366 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.645000 audit[4366]: USER_END pid=4366 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:16.205438 kernel: audit: type=1104 audit(1707813435.645:1440): pid=4366 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:15.645000 audit[4366]: CRED_DISP pid=4366 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:16.246423 sshd[4391]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root
Feb 13 08:37:15.646000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-145.40.67.89:22-139.178.68.195:36128 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:15.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.67.89:22-161.35.108.241:40058 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:16.245000 audit[4391]: USER_AUTH pid=4391 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=?
acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:37:17.264310 kubelet[2614]: E0213 08:37:17.264260 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:17.345846 kubelet[2614]: E0213 08:37:17.345776 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.345846 kubelet[2614]: W0213 08:37:17.345823 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.346261 kubelet[2614]: E0213 08:37:17.345870 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.346503 kubelet[2614]: E0213 08:37:17.346408 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.346503 kubelet[2614]: W0213 08:37:17.346442 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.346503 kubelet[2614]: E0213 08:37:17.346480 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.347030 kubelet[2614]: E0213 08:37:17.346999 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.347030 kubelet[2614]: W0213 08:37:17.347027 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.347280 kubelet[2614]: E0213 08:37:17.347062 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.347673 kubelet[2614]: E0213 08:37:17.347627 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.347673 kubelet[2614]: W0213 08:37:17.347661 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.348073 kubelet[2614]: E0213 08:37:17.347705 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.348333 kubelet[2614]: E0213 08:37:17.348241 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.348333 kubelet[2614]: W0213 08:37:17.348274 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.348333 kubelet[2614]: E0213 08:37:17.348314 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.348841 kubelet[2614]: E0213 08:37:17.348808 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.348991 kubelet[2614]: W0213 08:37:17.348843 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.348991 kubelet[2614]: E0213 08:37:17.348882 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.349532 kubelet[2614]: E0213 08:37:17.349485 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.349532 kubelet[2614]: W0213 08:37:17.349520 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.349882 kubelet[2614]: E0213 08:37:17.349563 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.350145 kubelet[2614]: E0213 08:37:17.350072 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.350145 kubelet[2614]: W0213 08:37:17.350098 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.350145 kubelet[2614]: E0213 08:37:17.350136 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.350699 kubelet[2614]: E0213 08:37:17.350645 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.350699 kubelet[2614]: W0213 08:37:17.350680 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.350987 kubelet[2614]: E0213 08:37:17.350719 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.351375 kubelet[2614]: E0213 08:37:17.351327 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.351375 kubelet[2614]: W0213 08:37:17.351360 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.351705 kubelet[2614]: E0213 08:37:17.351400 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.351911 kubelet[2614]: E0213 08:37:17.351884 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.351911 kubelet[2614]: W0213 08:37:17.351910 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.352222 kubelet[2614]: E0213 08:37:17.351983 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.352528 kubelet[2614]: E0213 08:37:17.352481 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.352528 kubelet[2614]: W0213 08:37:17.352515 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.352857 kubelet[2614]: E0213 08:37:17.352555 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.353195 kubelet[2614]: E0213 08:37:17.353162 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.353348 kubelet[2614]: W0213 08:37:17.353199 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.353348 kubelet[2614]: E0213 08:37:17.353240 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.353782 kubelet[2614]: E0213 08:37:17.353733 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.353782 kubelet[2614]: W0213 08:37:17.353759 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.354087 kubelet[2614]: E0213 08:37:17.353793 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.354384 kubelet[2614]: E0213 08:37:17.354309 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.354384 kubelet[2614]: W0213 08:37:17.354343 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.354384 kubelet[2614]: E0213 08:37:17.354382 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.354911 kubelet[2614]: E0213 08:37:17.354883 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.355109 kubelet[2614]: W0213 08:37:17.354914 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.355109 kubelet[2614]: E0213 08:37:17.354986 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.355561 kubelet[2614]: E0213 08:37:17.355504 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.355561 kubelet[2614]: W0213 08:37:17.355537 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.355843 kubelet[2614]: E0213 08:37:17.355576 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.356184 kubelet[2614]: E0213 08:37:17.356138 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.356184 kubelet[2614]: W0213 08:37:17.356173 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.356539 kubelet[2614]: E0213 08:37:17.356219 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:17.356740 kubelet[2614]: E0213 08:37:17.356713 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.356740 kubelet[2614]: W0213 08:37:17.356739 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.357022 kubelet[2614]: E0213 08:37:17.356774 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:17.357317 kubelet[2614]: E0213 08:37:17.357263 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:17.357317 kubelet[2614]: W0213 08:37:17.357296 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:17.357557 kubelet[2614]: E0213 08:37:17.357336 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:18.252170 sshd[4391]: Failed password for root from 161.35.108.241 port 40058 ssh2 Feb 13 08:37:19.175250 sshd[4391]: Received disconnect from 161.35.108.241 port 40058:11: Bye Bye [preauth] Feb 13 08:37:19.175250 sshd[4391]: Disconnected from authenticating user root 161.35.108.241 port 40058 [preauth] Feb 13 08:37:19.177784 systemd[1]: sshd@41-145.40.67.89:22-161.35.108.241:40058.service: Deactivated successfully. 
Feb 13 08:37:19.177000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-145.40.67.89:22-161.35.108.241:40058 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:19.265465 kubelet[2614]: E0213 08:37:19.265398 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:20.225863 systemd[1]: Started sshd@42-145.40.67.89:22-43.153.15.221:41924.service. Feb 13 08:37:20.225000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.67.89:22-43.153.15.221:41924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:20.340177 sshd[4416]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:37:20.339000 audit[4416]: USER_AUTH pid=4416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:37:20.403170 kubelet[2614]: E0213 08:37:20.403115 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:37:20.654852 systemd[1]: Started sshd@43-145.40.67.89:22-139.178.68.195:59288.service. 
Feb 13 08:37:20.654000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.67.89:22-139.178.68.195:59288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:20.681707 kernel: kauditd_printk_skb: 6 callbacks suppressed Feb 13 08:37:20.681797 kernel: audit: type=1130 audit(1707813440.654:1447): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.67.89:22-139.178.68.195:59288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:20.710435 sshd[4419]: Accepted publickey for core from 139.178.68.195 port 59288 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:37:20.713262 sshd[4419]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:37:20.715613 systemd-logind[1463]: New session 31 of user core. Feb 13 08:37:20.716127 systemd[1]: Started session-31.scope. Feb 13 08:37:20.709000 audit[4419]: USER_ACCT pid=4419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.795804 sshd[4419]: pam_unix(sshd:session): session closed for user core Feb 13 08:37:20.797347 systemd[1]: sshd@43-145.40.67.89:22-139.178.68.195:59288.service: Deactivated successfully. Feb 13 08:37:20.797763 systemd[1]: session-31.scope: Deactivated successfully. Feb 13 08:37:20.798225 systemd-logind[1463]: Session 31 logged out. Waiting for processes to exit. Feb 13 08:37:20.798758 systemd-logind[1463]: Removed session 31. 
Feb 13 08:37:20.862773 kernel: audit: type=1101 audit(1707813440.709:1448): pid=4419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.862809 kernel: audit: type=1103 audit(1707813440.712:1449): pid=4419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.712000 audit[4419]: CRED_ACQ pid=4419 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:21.012008 kernel: audit: type=1006 audit(1707813440.712:1450): pid=4419 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Feb 13 08:37:21.012043 kernel: audit: type=1300 audit(1707813440.712:1450): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffccf059f60 a2=3 a3=0 items=0 ppid=1 pid=4419 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:20.712000 audit[4419]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffccf059f60 a2=3 a3=0 items=0 ppid=1 pid=4419 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:21.104119 kernel: audit: type=1327 audit(1707813440.712:1450): proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:20.712000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:37:21.134646 kernel: audit: type=1105 audit(1707813440.717:1451): pid=4419 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.717000 audit[4419]: USER_START pid=4419 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:21.229201 kernel: audit: type=1103 audit(1707813440.718:1452): pid=4421 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.718000 audit[4421]: CRED_ACQ pid=4421 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:21.264392 kubelet[2614]: E0213 08:37:21.264352 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:21.318501 kernel: audit: type=1106 audit(1707813440.795:1453): pid=4419 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.795000 audit[4419]: USER_END pid=4419 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:21.414039 kernel: audit: type=1104 audit(1707813440.795:1454): pid=4419 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.795000 audit[4419]: CRED_DISP pid=4419 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:20.796000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-145.40.67.89:22-139.178.68.195:59288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:22.226184 sshd[4416]: Failed password for root from 43.153.15.221 port 41924 ssh2 Feb 13 08:37:23.211109 sshd[4416]: Received disconnect from 43.153.15.221 port 41924:11: Bye Bye [preauth] Feb 13 08:37:23.211109 sshd[4416]: Disconnected from authenticating user root 43.153.15.221 port 41924 [preauth] Feb 13 08:37:23.213682 systemd[1]: sshd@42-145.40.67.89:22-43.153.15.221:41924.service: Deactivated successfully. Feb 13 08:37:23.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-145.40.67.89:22-43.153.15.221:41924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:37:23.265129 kubelet[2614]: E0213 08:37:23.265080 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:25.265565 kubelet[2614]: E0213 08:37:25.265442 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:25.405368 kubelet[2614]: E0213 08:37:25.405255 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:37:25.805564 systemd[1]: Started sshd@44-145.40.67.89:22-139.178.68.195:59300.service. Feb 13 08:37:25.804000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.67.89:22-139.178.68.195:59300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:25.832639 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:37:25.832678 kernel: audit: type=1130 audit(1707813445.804:1457): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.67.89:22-139.178.68.195:59300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:37:25.949000 audit[4448]: USER_ACCT pid=4448 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:25.950485 sshd[4448]: Accepted publickey for core from 139.178.68.195 port 59300 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:37:25.952230 sshd[4448]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:37:25.954527 systemd-logind[1463]: New session 32 of user core. Feb 13 08:37:25.955063 systemd[1]: Started session-32.scope. Feb 13 08:37:26.032658 sshd[4448]: pam_unix(sshd:session): session closed for user core Feb 13 08:37:26.034031 systemd[1]: sshd@44-145.40.67.89:22-139.178.68.195:59300.service: Deactivated successfully. Feb 13 08:37:26.034438 systemd[1]: session-32.scope: Deactivated successfully. Feb 13 08:37:26.034781 systemd-logind[1463]: Session 32 logged out. Waiting for processes to exit. Feb 13 08:37:26.035314 systemd-logind[1463]: Removed session 32. 
Feb 13 08:37:25.951000 audit[4448]: CRED_ACQ pid=4448 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.133071 kernel: audit: type=1101 audit(1707813445.949:1458): pid=4448 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.133107 kernel: audit: type=1103 audit(1707813445.951:1459): pid=4448 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.133123 kernel: audit: type=1006 audit(1707813445.951:1460): pid=4448 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1
Feb 13 08:37:26.191683 kernel: audit: type=1300 audit(1707813445.951:1460): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe519b640 a2=3 a3=0 items=0 ppid=1 pid=4448 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:25.951000 audit[4448]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffe519b640 a2=3 a3=0 items=0 ppid=1 pid=4448 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:26.283738 kernel: audit: type=1327 audit(1707813445.951:1460): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:25.951000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:26.314245 kernel: audit: type=1105 audit(1707813445.956:1461): pid=4448 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:25.956000 audit[4448]: USER_START pid=4448 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.408752 kernel: audit: type=1103 audit(1707813445.957:1462): pid=4450 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:25.957000 audit[4450]: CRED_ACQ pid=4450 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.498146 kernel: audit: type=1106 audit(1707813446.032:1463): pid=4448 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.032000 audit[4448]: USER_END pid=4448 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.593674 kernel: audit: type=1104 audit(1707813446.032:1464): pid=4448 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.032000 audit[4448]: CRED_DISP pid=4448 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:26.033000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-145.40.67.89:22-139.178.68.195:59300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:27.265105 kubelet[2614]: E0213 08:37:27.265086 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:29.265277 kubelet[2614]: E0213 08:37:29.265211 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:30.407208 kubelet[2614]: E0213 08:37:30.407104 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:31.042257 systemd[1]: Started sshd@45-145.40.67.89:22-139.178.68.195:55506.service.
Feb 13 08:37:31.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.67.89:22-139.178.68.195:55506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:31.069235 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:37:31.069286 kernel: audit: type=1130 audit(1707813451.041:1466): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.67.89:22-139.178.68.195:55506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:31.186000 audit[4474]: USER_ACCT pid=4474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.187481 sshd[4474]: Accepted publickey for core from 139.178.68.195 port 55506 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:37:31.188816 sshd[4474]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:37:31.191600 systemd-logind[1463]: New session 33 of user core.
Feb 13 08:37:31.192343 systemd[1]: Started session-33.scope.
Feb 13 08:37:31.264415 kubelet[2614]: E0213 08:37:31.264394 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:31.272010 sshd[4474]: pam_unix(sshd:session): session closed for user core
Feb 13 08:37:31.273527 systemd[1]: sshd@45-145.40.67.89:22-139.178.68.195:55506.service: Deactivated successfully.
Feb 13 08:37:31.273962 systemd[1]: session-33.scope: Deactivated successfully.
Feb 13 08:37:31.274331 systemd-logind[1463]: Session 33 logged out. Waiting for processes to exit.
Feb 13 08:37:31.274810 systemd-logind[1463]: Removed session 33.
Feb 13 08:37:31.187000 audit[4474]: CRED_ACQ pid=4474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.369395 kernel: audit: type=1101 audit(1707813451.186:1467): pid=4474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.369458 kernel: audit: type=1103 audit(1707813451.187:1468): pid=4474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.369479 kernel: audit: type=1006 audit(1707813451.187:1469): pid=4474 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1
Feb 13 08:37:31.187000 audit[4474]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe949f4730 a2=3 a3=0 items=0 ppid=1 pid=4474 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:31.520090 kernel: audit: type=1300 audit(1707813451.187:1469): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe949f4730 a2=3 a3=0 items=0 ppid=1 pid=4474 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:31.520151 kernel: audit: type=1327 audit(1707813451.187:1469): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:31.187000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:31.550630 kernel: audit: type=1105 audit(1707813451.193:1470): pid=4474 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.193000 audit[4474]: USER_START pid=4474 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.645295 kernel: audit: type=1103 audit(1707813451.194:1471): pid=4476 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.194000 audit[4476]: CRED_ACQ pid=4476 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.734580 kernel: audit: type=1106 audit(1707813451.271:1472): pid=4474 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.271000 audit[4474]: USER_END pid=4474 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.830160 kernel: audit: type=1104 audit(1707813451.271:1473): pid=4474 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.271000 audit[4474]: CRED_DISP pid=4474 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:31.272000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-145.40.67.89:22-139.178.68.195:55506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:33.264825 kubelet[2614]: E0213 08:37:33.264805 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:35.264858 kubelet[2614]: E0213 08:37:35.264791 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:35.408074 kubelet[2614]: E0213 08:37:35.408056 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:35.608000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.608000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00233bec0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:37:35.608000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:37:35.608000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.608000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0024a1420 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:37:35.608000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:37:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c004c98340 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:37:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:37:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0047b6220 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:37:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:37:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c002f2e4e0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:37:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:37:35.891000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c00f59aa50 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:37:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:37:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c008276930 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:37:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:37:35.891000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0071ae000 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:37:35.891000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:37:36.283510 systemd[1]: Started sshd@46-145.40.67.89:22-139.178.68.195:39926.service.
Feb 13 08:37:36.283000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.67.89:22-139.178.68.195:39926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:36.326711 kernel: kauditd_printk_skb: 25 callbacks suppressed
Feb 13 08:37:36.326823 kernel: audit: type=1130 audit(1707813456.283:1483): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.67.89:22-139.178.68.195:39926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:36.443000 audit[4501]: USER_ACCT pid=4501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.444475 sshd[4501]: Accepted publickey for core from 139.178.68.195 port 39926 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:37:36.445677 sshd[4501]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:37:36.447987 systemd-logind[1463]: New session 34 of user core.
Feb 13 08:37:36.448545 systemd[1]: Started session-34.scope.
Feb 13 08:37:36.526028 sshd[4501]: pam_unix(sshd:session): session closed for user core
Feb 13 08:37:36.527517 systemd[1]: sshd@46-145.40.67.89:22-139.178.68.195:39926.service: Deactivated successfully.
Feb 13 08:37:36.527955 systemd[1]: session-34.scope: Deactivated successfully.
Feb 13 08:37:36.528393 systemd-logind[1463]: Session 34 logged out. Waiting for processes to exit.
Feb 13 08:37:36.528840 systemd-logind[1463]: Removed session 34.
Feb 13 08:37:36.444000 audit[4501]: CRED_ACQ pid=4501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.626352 kernel: audit: type=1101 audit(1707813456.443:1484): pid=4501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.626426 kernel: audit: type=1103 audit(1707813456.444:1485): pid=4501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.626450 kernel: audit: type=1006 audit(1707813456.444:1486): pid=4501 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1
Feb 13 08:37:36.444000 audit[4501]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc384b5b80 a2=3 a3=0 items=0 ppid=1 pid=4501 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:36.777100 kernel: audit: type=1300 audit(1707813456.444:1486): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc384b5b80 a2=3 a3=0 items=0 ppid=1 pid=4501 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:37:36.777142 kernel: audit: type=1327 audit(1707813456.444:1486): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:36.444000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:37:36.807612 kernel: audit: type=1105 audit(1707813456.449:1487): pid=4501 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.449000 audit[4501]: USER_START pid=4501 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.902167 kernel: audit: type=1103 audit(1707813456.450:1488): pid=4503 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.450000 audit[4503]: CRED_ACQ pid=4503 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.991445 kernel: audit: type=1106 audit(1707813456.525:1489): pid=4501 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.525000 audit[4501]: USER_END pid=4501 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:37.087022 kernel: audit: type=1104 audit(1707813456.525:1490): pid=4501 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.525000 audit[4501]: CRED_DISP pid=4501 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:36.526000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-145.40.67.89:22-139.178.68.195:39926 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:37.264457 kubelet[2614]: E0213 08:37:37.264415 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:39.264599 kubelet[2614]: E0213 08:37:39.264528 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:40.409242 kubelet[2614]: E0213 08:37:40.409134 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:37:40.948000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:40.948000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00103c4c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:37:40.948000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:37:40.948000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:40.948000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002e8b5e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:37:40.948000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:37:40.949000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:40.949000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00103c4e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:37:40.949000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:37:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:37:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002684fa0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:37:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:37:41.265114 kubelet[2614]: E0213 08:37:41.265042 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:37:41.535724 systemd[1]: Started sshd@47-145.40.67.89:22-139.178.68.195:39938.service.
Feb 13 08:37:41.534000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.67.89:22-139.178.68.195:39938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:41.562829 kernel: kauditd_printk_skb: 13 callbacks suppressed
Feb 13 08:37:41.562870 kernel: audit: type=1130 audit(1707813461.534:1496): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.67.89:22-139.178.68.195:39938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:37:41.680000 audit[4528]: USER_ACCT pid=4528 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:37:41.681522 sshd[4528]: Accepted publickey for core from 139.178.68.195 port 39938 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:37:41.684222 sshd[4528]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:37:41.686636 systemd-logind[1463]: New session 35 of user core.
Feb 13 08:37:41.687144 systemd[1]: Started session-35.scope.
Feb 13 08:37:41.764851 sshd[4528]: pam_unix(sshd:session): session closed for user core
Feb 13 08:37:41.766337 systemd[1]: sshd@47-145.40.67.89:22-139.178.68.195:39938.service: Deactivated successfully.
Feb 13 08:37:41.766762 systemd[1]: session-35.scope: Deactivated successfully.
Feb 13 08:37:41.767150 systemd-logind[1463]: Session 35 logged out. Waiting for processes to exit.
Feb 13 08:37:41.767714 systemd-logind[1463]: Removed session 35.
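The AVC `denied { watch }` records in this log all carry `arch=c000003e syscall=254 success=no exit=-13`, i.e. on x86_64 the `inotify_add_watch` syscall failing with `-EACCES` when kube-controller-manager and kube-apiserver try to watch the certificates under `/etc/kubernetes/pki/`. A minimal sketch of pulling fields out of such a record for triage; `parse_audit_fields` is a hypothetical helper, not part of auditd or any tool in this log:

```python
# Sketch: split an audit SYSCALL record into its key=value fields.
# parse_audit_fields is illustrative only, not an auditd/ausearch API.
import shlex

def parse_audit_fields(record: str) -> dict:
    """Return a dict of key=value tokens; shlex honors quoting like comm="sshd"."""
    fields = {}
    for token in shlex.split(record):
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
    return fields

# One of the denied-watch records from the log, abbreviated to unquoted-safe fields.
rec = ('audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 '
       'a0=b a1=c00103c4c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 '
       'comm="kube-controller" exe="/usr/local/bin/kube-controller-manager"')
f = parse_audit_fields(rec)
print(f["syscall"], f["exit"], f["comm"])  # syscall 254 = inotify_add_watch, exit -13 = -EACCES
```

On a live system the same lookup is usually done with `ausyscall 254` or `ausearch -i`, which render the numeric fields symbolically.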
Feb 13 08:37:41.683000 audit[4528]: CRED_ACQ pid=4528 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.863516 kernel: audit: type=1101 audit(1707813461.680:1497): pid=4528 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.863551 kernel: audit: type=1103 audit(1707813461.683:1498): pid=4528 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.863566 kernel: audit: type=1006 audit(1707813461.683:1499): pid=4528 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Feb 13 08:37:41.922121 kernel: audit: type=1300 audit(1707813461.683:1499): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb3b0a0e0 a2=3 a3=0 items=0 ppid=1 pid=4528 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:41.683000 audit[4528]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb3b0a0e0 a2=3 a3=0 items=0 ppid=1 pid=4528 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:42.014226 kernel: audit: type=1327 audit(1707813461.683:1499): proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:41.683000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:37:42.044751 kernel: audit: type=1105 audit(1707813461.688:1500): pid=4528 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.688000 audit[4528]: USER_START pid=4528 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:42.139323 kernel: audit: type=1103 audit(1707813461.689:1501): pid=4530 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.689000 audit[4530]: CRED_ACQ pid=4530 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:42.228570 kernel: audit: type=1106 audit(1707813461.764:1502): pid=4528 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.764000 audit[4528]: USER_END pid=4528 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:37:42.324147 kernel: audit: type=1104 audit(1707813461.764:1503): pid=4528 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.764000 audit[4528]: CRED_DISP pid=4528 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:41.765000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-145.40.67.89:22-139.178.68.195:39938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:43.265003 kubelet[2614]: E0213 08:37:43.264984 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:44.364192 kubelet[2614]: E0213 08:37:44.364124 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.364192 kubelet[2614]: W0213 08:37:44.364175 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.365429 kubelet[2614]: E0213 08:37:44.364247 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.365429 kubelet[2614]: E0213 08:37:44.364795 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.365429 kubelet[2614]: W0213 08:37:44.364831 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.365429 kubelet[2614]: E0213 08:37:44.364888 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:44.366155 kubelet[2614]: E0213 08:37:44.365464 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.366155 kubelet[2614]: W0213 08:37:44.365500 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.366155 kubelet[2614]: E0213 08:37:44.365557 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.366663 kubelet[2614]: E0213 08:37:44.366264 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.366663 kubelet[2614]: W0213 08:37:44.366299 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.366663 kubelet[2614]: E0213 08:37:44.366352 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:44.367212 kubelet[2614]: E0213 08:37:44.366896 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.367212 kubelet[2614]: W0213 08:37:44.366954 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.367212 kubelet[2614]: E0213 08:37:44.367010 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.367666 kubelet[2614]: E0213 08:37:44.367528 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.367666 kubelet[2614]: W0213 08:37:44.367563 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.367666 kubelet[2614]: E0213 08:37:44.367614 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:44.368218 kubelet[2614]: E0213 08:37:44.368169 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.368218 kubelet[2614]: W0213 08:37:44.368200 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.368586 kubelet[2614]: E0213 08:37:44.368250 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.368801 kubelet[2614]: E0213 08:37:44.368764 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.369024 kubelet[2614]: W0213 08:37:44.368799 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.369024 kubelet[2614]: E0213 08:37:44.368848 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:44.433626 kubelet[2614]: E0213 08:37:44.433562 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.433626 kubelet[2614]: W0213 08:37:44.433585 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.433626 kubelet[2614]: E0213 08:37:44.433614 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.434046 kubelet[2614]: E0213 08:37:44.433991 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.434046 kubelet[2614]: W0213 08:37:44.434007 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.434046 kubelet[2614]: E0213 08:37:44.434028 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:44.434390 kubelet[2614]: E0213 08:37:44.434331 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.434390 kubelet[2614]: W0213 08:37:44.434350 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.434390 kubelet[2614]: E0213 08:37:44.434372 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.434707 kubelet[2614]: E0213 08:37:44.434674 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.434707 kubelet[2614]: W0213 08:37:44.434692 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.434885 kubelet[2614]: E0213 08:37:44.434714 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:44.435168 kubelet[2614]: E0213 08:37:44.435104 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.435168 kubelet[2614]: W0213 08:37:44.435121 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.435168 kubelet[2614]: E0213 08:37:44.435145 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:44.435813 kubelet[2614]: E0213 08:37:44.435750 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:44.435813 kubelet[2614]: W0213 08:37:44.435769 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:44.435813 kubelet[2614]: E0213 08:37:44.435793 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:45.265158 kubelet[2614]: E0213 08:37:45.265112 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:45.410399 kubelet[2614]: E0213 08:37:45.410334 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:37:46.775016 systemd[1]: Started sshd@48-145.40.67.89:22-139.178.68.195:52260.service. Feb 13 08:37:46.774000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.67.89:22-139.178.68.195:52260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:37:46.801934 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:37:46.802008 kernel: audit: type=1130 audit(1707813466.774:1505): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.67.89:22-139.178.68.195:52260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:46.919000 audit[4569]: USER_ACCT pid=4569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:46.920404 sshd[4569]: Accepted publickey for core from 139.178.68.195 port 52260 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:37:46.922210 sshd[4569]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:37:46.924658 systemd-logind[1463]: New session 36 of user core. Feb 13 08:37:46.925368 systemd[1]: Started session-36.scope. Feb 13 08:37:47.005080 sshd[4569]: pam_unix(sshd:session): session closed for user core Feb 13 08:37:47.006532 systemd[1]: sshd@48-145.40.67.89:22-139.178.68.195:52260.service: Deactivated successfully. Feb 13 08:37:47.006991 systemd[1]: session-36.scope: Deactivated successfully. Feb 13 08:37:47.007368 systemd-logind[1463]: Session 36 logged out. Waiting for processes to exit. Feb 13 08:37:47.007807 systemd-logind[1463]: Removed session 36. 
Feb 13 08:37:46.921000 audit[4569]: CRED_ACQ pid=4569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.104018 kernel: audit: type=1101 audit(1707813466.919:1506): pid=4569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.104057 kernel: audit: type=1103 audit(1707813466.921:1507): pid=4569 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.104072 kernel: audit: type=1006 audit(1707813466.921:1508): pid=4569 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Feb 13 08:37:46.921000 audit[4569]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffad37ac10 a2=3 a3=0 items=0 ppid=1 pid=4569 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:47.162933 kernel: audit: type=1300 audit(1707813466.921:1508): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffad37ac10 a2=3 a3=0 items=0 ppid=1 pid=4569 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:46.921000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:47.264114 kubelet[2614]: E0213 08:37:47.264103 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is 
not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:47.285355 kernel: audit: type=1327 audit(1707813466.921:1508): proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:47.285387 kernel: audit: type=1105 audit(1707813466.927:1509): pid=4569 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:46.927000 audit[4569]: USER_START pid=4569 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.379863 kernel: audit: type=1103 audit(1707813466.927:1510): pid=4571 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:46.927000 audit[4571]: CRED_ACQ pid=4571 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.469128 kernel: audit: type=1106 audit(1707813467.004:1511): pid=4569 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.004000 audit[4569]: USER_END pid=4569 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.564652 kernel: audit: type=1104 audit(1707813467.004:1512): pid=4569 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.004000 audit[4569]: CRED_DISP pid=4569 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:47.005000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-145.40.67.89:22-139.178.68.195:52260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:37:49.265172 kubelet[2614]: E0213 08:37:49.265105 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:50.412077 kubelet[2614]: E0213 08:37:50.411991 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:37:51.264930 kubelet[2614]: E0213 08:37:51.264880 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:51.321415 kubelet[2614]: E0213 08:37:51.321346 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.321415 kubelet[2614]: W0213 08:37:51.321395 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.321884 kubelet[2614]: E0213 08:37:51.321456 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:51.322114 kubelet[2614]: E0213 08:37:51.322006 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.322114 kubelet[2614]: W0213 08:37:51.322041 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.322114 kubelet[2614]: E0213 08:37:51.322090 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:51.322611 kubelet[2614]: E0213 08:37:51.322573 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.322806 kubelet[2614]: W0213 08:37:51.322609 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.322806 kubelet[2614]: E0213 08:37:51.322658 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:51.323227 kubelet[2614]: E0213 08:37:51.323189 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.323227 kubelet[2614]: W0213 08:37:51.323219 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.323598 kubelet[2614]: E0213 08:37:51.323268 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:51.323824 kubelet[2614]: E0213 08:37:51.323802 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.323981 kubelet[2614]: W0213 08:37:51.323834 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.323981 kubelet[2614]: E0213 08:37:51.323875 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:51.324465 kubelet[2614]: E0213 08:37:51.324391 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.324465 kubelet[2614]: W0213 08:37:51.324423 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.324465 kubelet[2614]: E0213 08:37:51.324462 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:51.325092 kubelet[2614]: E0213 08:37:51.325023 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.325092 kubelet[2614]: W0213 08:37:51.325048 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.325092 kubelet[2614]: E0213 08:37:51.325084 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:51.325635 kubelet[2614]: E0213 08:37:51.325560 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.325635 kubelet[2614]: W0213 08:37:51.325593 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.325635 kubelet[2614]: E0213 08:37:51.325631 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:51.326273 kubelet[2614]: E0213 08:37:51.326181 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.326273 kubelet[2614]: W0213 08:37:51.326213 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.326273 kubelet[2614]: E0213 08:37:51.326256 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:51.326867 kubelet[2614]: E0213 08:37:51.326797 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.326867 kubelet[2614]: W0213 08:37:51.326835 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.326867 kubelet[2614]: E0213 08:37:51.326873 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:51.327504 kubelet[2614]: E0213 08:37:51.327415 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.327504 kubelet[2614]: W0213 08:37:51.327448 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.327504 kubelet[2614]: E0213 08:37:51.327487 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:37:51.328069 kubelet[2614]: E0213 08:37:51.327987 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:37:51.328069 kubelet[2614]: W0213 08:37:51.328013 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:37:51.328069 kubelet[2614]: E0213 08:37:51.328047 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:37:52.013908 systemd[1]: Started sshd@49-145.40.67.89:22-139.178.68.195:52262.service. Feb 13 08:37:52.013000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.67.89:22-139.178.68.195:52262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:52.041083 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:37:52.041174 kernel: audit: type=1130 audit(1707813472.013:1514): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.67.89:22-139.178.68.195:52262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:37:52.158000 audit[4608]: USER_ACCT pid=4608 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.159247 sshd[4608]: Accepted publickey for core from 139.178.68.195 port 52262 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:37:52.164215 sshd[4608]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:37:52.166579 systemd-logind[1463]: New session 37 of user core. Feb 13 08:37:52.167075 systemd[1]: Started session-37.scope. Feb 13 08:37:52.243907 sshd[4608]: pam_unix(sshd:session): session closed for user core Feb 13 08:37:52.245256 systemd[1]: sshd@49-145.40.67.89:22-139.178.68.195:52262.service: Deactivated successfully. Feb 13 08:37:52.245682 systemd[1]: session-37.scope: Deactivated successfully. Feb 13 08:37:52.246002 systemd-logind[1463]: Session 37 logged out. Waiting for processes to exit. Feb 13 08:37:52.246477 systemd-logind[1463]: Removed session 37. 
Feb 13 08:37:52.163000 audit[4608]: CRED_ACQ pid=4608 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.340994 kernel: audit: type=1101 audit(1707813472.158:1515): pid=4608 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.341048 kernel: audit: type=1103 audit(1707813472.163:1516): pid=4608 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.341067 kernel: audit: type=1006 audit(1707813472.163:1517): pid=4608 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Feb 13 08:37:52.399551 kernel: audit: type=1300 audit(1707813472.163:1517): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0aaf5480 a2=3 a3=0 items=0 ppid=1 pid=4608 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:52.163000 audit[4608]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0aaf5480 a2=3 a3=0 items=0 ppid=1 pid=4608 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:52.491527 kernel: audit: type=1327 audit(1707813472.163:1517): proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:52.163000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:37:52.522031 kernel: audit: type=1105 audit(1707813472.168:1518): pid=4608 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.168000 audit[4608]: USER_START pid=4608 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.616482 kernel: audit: type=1103 audit(1707813472.169:1519): pid=4610 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.169000 audit[4610]: CRED_ACQ pid=4610 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.705774 kernel: audit: type=1106 audit(1707813472.243:1520): pid=4608 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.243000 audit[4608]: USER_END pid=4608 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:37:52.801231 kernel: audit: type=1104 audit(1707813472.243:1521): pid=4608 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.243000 audit[4608]: CRED_DISP pid=4608 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:52.244000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-145.40.67.89:22-139.178.68.195:52262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:53.264816 kubelet[2614]: E0213 08:37:53.264775 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:55.264273 kubelet[2614]: E0213 08:37:55.264229 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:55.413602 kubelet[2614]: E0213 08:37:55.413500 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:37:57.253609 systemd[1]: Started sshd@50-145.40.67.89:22-139.178.68.195:35482.service. 
Feb 13 08:37:57.252000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.67.89:22-139.178.68.195:35482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:57.264816 kubelet[2614]: E0213 08:37:57.264775 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:37:57.280830 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:37:57.280869 kernel: audit: type=1130 audit(1707813477.252:1523): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.67.89:22-139.178.68.195:35482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:37:57.398000 audit[4635]: USER_ACCT pid=4635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.399400 sshd[4635]: Accepted publickey for core from 139.178.68.195 port 35482 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:37:57.400633 sshd[4635]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:37:57.403165 systemd-logind[1463]: New session 38 of user core. Feb 13 08:37:57.403595 systemd[1]: Started session-38.scope. Feb 13 08:37:57.481221 sshd[4635]: pam_unix(sshd:session): session closed for user core Feb 13 08:37:57.482644 systemd[1]: sshd@50-145.40.67.89:22-139.178.68.195:35482.service: Deactivated successfully. 
Feb 13 08:37:57.483069 systemd[1]: session-38.scope: Deactivated successfully. Feb 13 08:37:57.483458 systemd-logind[1463]: Session 38 logged out. Waiting for processes to exit. Feb 13 08:37:57.483901 systemd-logind[1463]: Removed session 38. Feb 13 08:37:57.399000 audit[4635]: CRED_ACQ pid=4635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.581187 kernel: audit: type=1101 audit(1707813477.398:1524): pid=4635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.581221 kernel: audit: type=1103 audit(1707813477.399:1525): pid=4635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.581236 kernel: audit: type=1006 audit(1707813477.399:1526): pid=4635 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=38 res=1 Feb 13 08:37:57.639772 kernel: audit: type=1300 audit(1707813477.399:1526): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3be71800 a2=3 a3=0 items=0 ppid=1 pid=4635 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:57.399000 audit[4635]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3be71800 a2=3 a3=0 items=0 ppid=1 pid=4635 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd" exe="/usr/sbin/sshd" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:37:57.731912 kernel: audit: type=1327 audit(1707813477.399:1526): proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:57.399000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:37:57.762407 kernel: audit: type=1105 audit(1707813477.404:1527): pid=4635 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.404000 audit[4635]: USER_START pid=4635 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.856932 kernel: audit: type=1103 audit(1707813477.405:1528): pid=4637 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.405000 audit[4637]: CRED_ACQ pid=4637 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.946164 kernel: audit: type=1106 audit(1707813477.480:1529): pid=4635 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.480000 audit[4635]: USER_END pid=4635 uid=0 auid=500 ses=38 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:58.041724 kernel: audit: type=1104 audit(1707813477.481:1530): pid=4635 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.481000 audit[4635]: CRED_DISP pid=4635 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:37:57.481000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-145.40.67.89:22-139.178.68.195:35482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:37:59.265091 kubelet[2614]: E0213 08:37:59.265065 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:00.415588 kubelet[2614]: E0213 08:38:00.415437 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:01.264956 kubelet[2614]: E0213 08:38:01.264879 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:02.491748 systemd[1]: Started sshd@51-145.40.67.89:22-139.178.68.195:35494.service. Feb 13 08:38:02.491000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.67.89:22-139.178.68.195:35494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:02.518768 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:38:02.518966 kernel: audit: type=1130 audit(1707813482.491:1532): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.67.89:22-139.178.68.195:35494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:02.635000 audit[4661]: USER_ACCT pid=4661 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.636410 sshd[4661]: Accepted publickey for core from 139.178.68.195 port 35494 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:02.640232 sshd[4661]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:02.642552 systemd-logind[1463]: New session 39 of user core. Feb 13 08:38:02.643034 systemd[1]: Started session-39.scope. Feb 13 08:38:02.721640 sshd[4661]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:02.722974 systemd[1]: sshd@51-145.40.67.89:22-139.178.68.195:35494.service: Deactivated successfully. Feb 13 08:38:02.723385 systemd[1]: session-39.scope: Deactivated successfully. Feb 13 08:38:02.723758 systemd-logind[1463]: Session 39 logged out. Waiting for processes to exit. Feb 13 08:38:02.724337 systemd-logind[1463]: Removed session 39. 
Feb 13 08:38:02.639000 audit[4661]: CRED_ACQ pid=4661 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.819571 kernel: audit: type=1101 audit(1707813482.635:1533): pid=4661 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.819654 kernel: audit: type=1103 audit(1707813482.639:1534): pid=4661 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.819673 kernel: audit: type=1006 audit(1707813482.639:1535): pid=4661 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1 Feb 13 08:38:02.878417 kernel: audit: type=1300 audit(1707813482.639:1535): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc863a4770 a2=3 a3=0 items=0 ppid=1 pid=4661 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:02.639000 audit[4661]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc863a4770 a2=3 a3=0 items=0 ppid=1 pid=4661 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:02.970551 kernel: audit: type=1327 audit(1707813482.639:1535): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:02.639000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:38:02.644000 audit[4661]: USER_START pid=4661 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:03.095737 kernel: audit: type=1105 audit(1707813482.644:1536): pid=4661 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:03.095777 kernel: audit: type=1103 audit(1707813482.644:1537): pid=4663 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.644000 audit[4663]: CRED_ACQ pid=4663 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.721000 audit[4661]: USER_END pid=4661 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:03.264377 kubelet[2614]: E0213 08:38:03.264344 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" 
podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:03.280770 kernel: audit: type=1106 audit(1707813482.721:1538): pid=4661 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:03.280801 kernel: audit: type=1104 audit(1707813482.721:1539): pid=4661 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.721000 audit[4661]: CRED_DISP pid=4661 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:02.722000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-145.40.67.89:22-139.178.68.195:35494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:05.264802 kubelet[2614]: E0213 08:38:05.264707 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:05.334175 kubelet[2614]: E0213 08:38:05.334088 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.334175 kubelet[2614]: W0213 08:38:05.334131 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.334175 kubelet[2614]: E0213 08:38:05.334179 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.334701 kubelet[2614]: E0213 08:38:05.334665 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.334701 kubelet[2614]: W0213 08:38:05.334699 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.334956 kubelet[2614]: E0213 08:38:05.334739 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.335308 kubelet[2614]: E0213 08:38:05.335230 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.335308 kubelet[2614]: W0213 08:38:05.335263 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.335308 kubelet[2614]: E0213 08:38:05.335301 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.335914 kubelet[2614]: E0213 08:38:05.335848 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.335914 kubelet[2614]: W0213 08:38:05.335882 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.335914 kubelet[2614]: E0213 08:38:05.335922 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.336566 kubelet[2614]: E0213 08:38:05.336474 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.336566 kubelet[2614]: W0213 08:38:05.336508 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.336566 kubelet[2614]: E0213 08:38:05.336548 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.336980 kubelet[2614]: E0213 08:38:05.336962 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.337119 kubelet[2614]: W0213 08:38:05.336986 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.337119 kubelet[2614]: E0213 08:38:05.337019 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.337645 kubelet[2614]: E0213 08:38:05.337556 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.337645 kubelet[2614]: W0213 08:38:05.337589 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.337645 kubelet[2614]: E0213 08:38:05.337633 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.338107 kubelet[2614]: E0213 08:38:05.338079 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.338107 kubelet[2614]: W0213 08:38:05.338104 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.338333 kubelet[2614]: E0213 08:38:05.338142 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.338719 kubelet[2614]: E0213 08:38:05.338630 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.338719 kubelet[2614]: W0213 08:38:05.338663 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.338719 kubelet[2614]: E0213 08:38:05.338702 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.339405 kubelet[2614]: E0213 08:38:05.339326 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.339405 kubelet[2614]: W0213 08:38:05.339359 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.339405 kubelet[2614]: E0213 08:38:05.339397 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.339870 kubelet[2614]: E0213 08:38:05.339842 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.339870 kubelet[2614]: W0213 08:38:05.339868 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.340151 kubelet[2614]: E0213 08:38:05.339903 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.340494 kubelet[2614]: E0213 08:38:05.340420 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.340494 kubelet[2614]: W0213 08:38:05.340455 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.340494 kubelet[2614]: E0213 08:38:05.340495 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.341053 kubelet[2614]: E0213 08:38:05.340973 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.341053 kubelet[2614]: W0213 08:38:05.341000 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.341053 kubelet[2614]: E0213 08:38:05.341033 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.341588 kubelet[2614]: E0213 08:38:05.341532 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.341588 kubelet[2614]: W0213 08:38:05.341567 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.341821 kubelet[2614]: E0213 08:38:05.341606 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.342127 kubelet[2614]: E0213 08:38:05.342041 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.342127 kubelet[2614]: W0213 08:38:05.342065 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.342127 kubelet[2614]: E0213 08:38:05.342097 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.342594 kubelet[2614]: E0213 08:38:05.342552 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.342713 kubelet[2614]: W0213 08:38:05.342595 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.342713 kubelet[2614]: E0213 08:38:05.342652 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.392490 kubelet[2614]: E0213 08:38:05.392385 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.392490 kubelet[2614]: W0213 08:38:05.392426 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.392490 kubelet[2614]: E0213 08:38:05.392474 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.393146 kubelet[2614]: E0213 08:38:05.393061 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.393146 kubelet[2614]: W0213 08:38:05.393088 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.393146 kubelet[2614]: E0213 08:38:05.393130 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.393808 kubelet[2614]: E0213 08:38:05.393717 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.393808 kubelet[2614]: W0213 08:38:05.393751 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.393808 kubelet[2614]: E0213 08:38:05.393797 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.394464 kubelet[2614]: E0213 08:38:05.394390 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.394464 kubelet[2614]: W0213 08:38:05.394423 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.394464 kubelet[2614]: E0213 08:38:05.394469 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.395038 kubelet[2614]: E0213 08:38:05.394969 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.395038 kubelet[2614]: W0213 08:38:05.394995 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.395338 kubelet[2614]: E0213 08:38:05.395135 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.395564 kubelet[2614]: E0213 08:38:05.395490 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.395564 kubelet[2614]: W0213 08:38:05.395523 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.395564 kubelet[2614]: E0213 08:38:05.395569 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.396048 kubelet[2614]: E0213 08:38:05.396018 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.396048 kubelet[2614]: W0213 08:38:05.396041 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.396277 kubelet[2614]: E0213 08:38:05.396079 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.396596 kubelet[2614]: E0213 08:38:05.396522 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.396596 kubelet[2614]: W0213 08:38:05.396547 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.396596 kubelet[2614]: E0213 08:38:05.396586 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.397065 kubelet[2614]: E0213 08:38:05.397030 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.397065 kubelet[2614]: W0213 08:38:05.397052 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.397288 kubelet[2614]: E0213 08:38:05.397183 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.397562 kubelet[2614]: E0213 08:38:05.397488 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.397562 kubelet[2614]: W0213 08:38:05.397512 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.397562 kubelet[2614]: E0213 08:38:05.397545 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.398204 kubelet[2614]: E0213 08:38:05.398131 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.398204 kubelet[2614]: W0213 08:38:05.398166 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.398204 kubelet[2614]: E0213 08:38:05.398211 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:05.398782 kubelet[2614]: E0213 08:38:05.398748 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:05.398909 kubelet[2614]: W0213 08:38:05.398784 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:05.398909 kubelet[2614]: E0213 08:38:05.398827 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:05.416546 kubelet[2614]: E0213 08:38:05.416458 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:07.264359 kubelet[2614]: E0213 08:38:07.264311 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:07.732457 systemd[1]: Started sshd@52-145.40.67.89:22-139.178.68.195:57848.service. Feb 13 08:38:07.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.67.89:22-139.178.68.195:57848 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:07.760162 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:38:07.760249 kernel: audit: type=1130 audit(1707813487.731:1541): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.67.89:22-139.178.68.195:57848 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:07.878000 audit[4716]: USER_ACCT pid=4716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:07.878393 sshd[4716]: Accepted publickey for core from 139.178.68.195 port 57848 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:07.879202 sshd[4716]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:07.881563 systemd-logind[1463]: New session 40 of user core. Feb 13 08:38:07.882136 systemd[1]: Started session-40.scope. Feb 13 08:38:07.961979 sshd[4716]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:07.963392 systemd[1]: sshd@52-145.40.67.89:22-139.178.68.195:57848.service: Deactivated successfully. Feb 13 08:38:07.963851 systemd[1]: session-40.scope: Deactivated successfully. Feb 13 08:38:07.964304 systemd-logind[1463]: Session 40 logged out. Waiting for processes to exit. Feb 13 08:38:07.964819 systemd-logind[1463]: Removed session 40. 
Feb 13 08:38:07.878000 audit[4716]: CRED_ACQ pid=4716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:08.062535 kernel: audit: type=1101 audit(1707813487.878:1542): pid=4716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:08.062565 kernel: audit: type=1103 audit(1707813487.878:1543): pid=4716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:08.062586 kernel: audit: type=1006 audit(1707813487.878:1544): pid=4716 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1 Feb 13 08:38:08.121147 kernel: audit: type=1300 audit(1707813487.878:1544): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa5cf4940 a2=3 a3=0 items=0 ppid=1 pid=4716 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:07.878000 audit[4716]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa5cf4940 a2=3 a3=0 items=0 ppid=1 pid=4716 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:08.213288 kernel: audit: type=1327 audit(1707813487.878:1544): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:07.878000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:38:08.243798 kernel: audit: type=1105 audit(1707813487.883:1545): pid=4716 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:07.883000 audit[4716]: USER_START pid=4716 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:08.338341 kernel: audit: type=1103 audit(1707813487.884:1546): pid=4718 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:07.884000 audit[4718]: CRED_ACQ pid=4718 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:08.427588 kernel: audit: type=1106 audit(1707813487.961:1547): pid=4716 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:07.961000 audit[4716]: USER_END pid=4716 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:38:08.523162 kernel: audit: type=1104 audit(1707813487.961:1548): pid=4716 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:07.961000 audit[4716]: CRED_DISP pid=4716 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:07.962000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-145.40.67.89:22-139.178.68.195:57848 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:09.265135 kubelet[2614]: E0213 08:38:09.265082 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:10.418168 kubelet[2614]: E0213 08:38:10.418069 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:11.264266 kubelet[2614]: E0213 08:38:11.264223 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:12.295452 kubelet[2614]: E0213 08:38:12.295399 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", 
error: unexpected end of JSON input Feb 13 08:38:12.295452 kubelet[2614]: W0213 08:38:12.295419 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:12.295452 kubelet[2614]: E0213 08:38:12.295437 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:12.295843 kubelet[2614]: E0213 08:38:12.295633 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:12.295843 kubelet[2614]: W0213 08:38:12.295644 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:12.295843 kubelet[2614]: E0213 08:38:12.295657 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:12.295970 kubelet[2614]: E0213 08:38:12.295850 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:12.295970 kubelet[2614]: W0213 08:38:12.295859 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:12.295970 kubelet[2614]: E0213 08:38:12.295870 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:12.296069 kubelet[2614]: E0213 08:38:12.296045 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:12.296069 kubelet[2614]: W0213 08:38:12.296056 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:12.296069 kubelet[2614]: E0213 08:38:12.296068 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:12.971589 systemd[1]: Started sshd@53-145.40.67.89:22-139.178.68.195:57858.service. Feb 13 08:38:12.970000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.67.89:22-139.178.68.195:57858 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:12.998664 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:38:12.998707 kernel: audit: type=1130 audit(1707813492.970:1550): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.67.89:22-139.178.68.195:57858 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:13.116000 audit[4749]: USER_ACCT pid=4749 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.118231 sshd[4749]: Accepted publickey for core from 139.178.68.195 port 57858 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:13.122215 sshd[4749]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:13.126794 systemd-logind[1463]: New session 41 of user core. Feb 13 08:38:13.127291 systemd[1]: Started session-41.scope. Feb 13 08:38:13.206030 sshd[4749]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:13.207374 systemd[1]: sshd@53-145.40.67.89:22-139.178.68.195:57858.service: Deactivated successfully. Feb 13 08:38:13.207788 systemd[1]: session-41.scope: Deactivated successfully. Feb 13 08:38:13.208185 systemd-logind[1463]: Session 41 logged out. Waiting for processes to exit. Feb 13 08:38:13.208665 systemd-logind[1463]: Removed session 41. 
Feb 13 08:38:13.120000 audit[4749]: CRED_ACQ pid=4749 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.265080 kubelet[2614]: E0213 08:38:13.265018 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:13.299750 kernel: audit: type=1101 audit(1707813493.116:1551): pid=4749 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.299786 kernel: audit: type=1103 audit(1707813493.120:1552): pid=4749 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.299803 kernel: audit: type=1006 audit(1707813493.120:1553): pid=4749 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1 Feb 13 08:38:13.358330 kernel: audit: type=1300 audit(1707813493.120:1553): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc40e6470 a2=3 a3=0 items=0 ppid=1 pid=4749 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:13.120000 audit[4749]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc40e6470 a2=3 a3=0 items=0 ppid=1 pid=4749 
auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:13.450307 kernel: audit: type=1327 audit(1707813493.120:1553): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:13.120000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:13.128000 audit[4749]: USER_START pid=4749 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.575243 kernel: audit: type=1105 audit(1707813493.128:1554): pid=4749 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.575270 kernel: audit: type=1103 audit(1707813493.128:1555): pid=4751 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.128000 audit[4751]: CRED_ACQ pid=4751 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.664437 kernel: audit: type=1106 audit(1707813493.205:1556): pid=4749 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.205000 audit[4749]: USER_END pid=4749 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.759891 kernel: audit: type=1104 audit(1707813493.205:1557): pid=4749 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.205000 audit[4749]: CRED_DISP pid=4749 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:13.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-145.40.67.89:22-139.178.68.195:57858 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:13.888817 systemd[1]: Started sshd@54-145.40.67.89:22-43.153.15.221:60756.service. Feb 13 08:38:13.887000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.67.89:22-43.153.15.221:60756 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:13.896368 systemd[1]: Started sshd@55-145.40.67.89:22-161.35.108.241:42588.service. Feb 13 08:38:13.895000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-145.40.67.89:22-161.35.108.241:42588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:14.010918 sshd[4775]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:38:14.010000 audit[4775]: USER_AUTH pid=4775 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:38:14.335472 sshd[4778]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:38:14.334000 audit[4778]: USER_AUTH pid=4778 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:38:15.265118 kubelet[2614]: E0213 08:38:15.265072 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:15.420182 kubelet[2614]: E0213 08:38:15.420089 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:15.977329 sshd[4775]: Failed password for root from 43.153.15.221 port 60756 ssh2 Feb 13 08:38:16.301923 sshd[4778]: Failed password for root from 161.35.108.241 port 42588 ssh2 Feb 13 08:38:16.880898 sshd[4775]: Received disconnect from 43.153.15.221 port 60756:11: Bye Bye [preauth] Feb 13 08:38:16.880898 sshd[4775]: Disconnected from authenticating user root 43.153.15.221 port 60756 [preauth] Feb 13 08:38:16.883470 systemd[1]: sshd@54-145.40.67.89:22-43.153.15.221:60756.service: Deactivated successfully. 
Feb 13 08:38:16.882000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-145.40.67.89:22-43.153.15.221:60756 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:17.259844 sshd[4778]: Received disconnect from 161.35.108.241 port 42588:11: Bye Bye [preauth] Feb 13 08:38:17.259844 sshd[4778]: Disconnected from authenticating user root 161.35.108.241 port 42588 [preauth] Feb 13 08:38:17.260559 systemd[1]: sshd@55-145.40.67.89:22-161.35.108.241:42588.service: Deactivated successfully. Feb 13 08:38:17.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-145.40.67.89:22-161.35.108.241:42588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:17.265028 kubelet[2614]: E0213 08:38:17.264998 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:18.215452 systemd[1]: Started sshd@56-145.40.67.89:22-139.178.68.195:45990.service. Feb 13 08:38:18.213000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.67.89:22-139.178.68.195:45990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:18.242621 kernel: kauditd_printk_skb: 7 callbacks suppressed Feb 13 08:38:18.242683 kernel: audit: type=1130 audit(1707813498.213:1565): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.67.89:22-139.178.68.195:45990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:18.359000 audit[4784]: USER_ACCT pid=4784 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.361078 sshd[4784]: Accepted publickey for core from 139.178.68.195 port 45990 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:18.363955 sshd[4784]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:18.368999 systemd-logind[1463]: New session 42 of user core. Feb 13 08:38:18.369528 systemd[1]: Started session-42.scope. Feb 13 08:38:18.446806 sshd[4784]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:18.448306 systemd[1]: sshd@56-145.40.67.89:22-139.178.68.195:45990.service: Deactivated successfully. Feb 13 08:38:18.448765 systemd[1]: session-42.scope: Deactivated successfully. Feb 13 08:38:18.449145 systemd-logind[1463]: Session 42 logged out. Waiting for processes to exit. Feb 13 08:38:18.449568 systemd-logind[1463]: Removed session 42. 
Feb 13 08:38:18.361000 audit[4784]: CRED_ACQ pid=4784 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.542570 kernel: audit: type=1101 audit(1707813498.359:1566): pid=4784 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.542618 kernel: audit: type=1103 audit(1707813498.361:1567): pid=4784 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.542635 kernel: audit: type=1006 audit(1707813498.361:1568): pid=4784 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1 Feb 13 08:38:18.601110 kernel: audit: type=1300 audit(1707813498.361:1568): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc9076b290 a2=3 a3=0 items=0 ppid=1 pid=4784 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:18.361000 audit[4784]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc9076b290 a2=3 a3=0 items=0 ppid=1 pid=4784 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:18.693054 kernel: audit: type=1327 audit(1707813498.361:1568): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:18.361000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:38:18.723551 kernel: audit: type=1105 audit(1707813498.369:1569): pid=4784 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.369000 audit[4784]: USER_START pid=4784 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.818013 kernel: audit: type=1103 audit(1707813498.370:1570): pid=4786 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.370000 audit[4786]: CRED_ACQ pid=4786 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.907213 kernel: audit: type=1106 audit(1707813498.445:1571): pid=4784 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.445000 audit[4784]: USER_END pid=4784 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:38:19.002687 kernel: audit: type=1104 audit(1707813498.445:1572): pid=4784 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.445000 audit[4784]: CRED_DISP pid=4784 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:18.446000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-145.40.67.89:22-139.178.68.195:45990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:19.265323 kubelet[2614]: E0213 08:38:19.265265 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:20.421322 kubelet[2614]: E0213 08:38:20.421256 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:21.265166 kubelet[2614]: E0213 08:38:21.265145 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:23.265122 kubelet[2614]: E0213 08:38:23.265096 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:23.459536 systemd[1]: Started sshd@57-145.40.67.89:22-139.178.68.195:46002.service. Feb 13 08:38:23.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.67.89:22-139.178.68.195:46002 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:23.487196 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:38:23.487273 kernel: audit: type=1130 audit(1707813503.459:1574): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.67.89:22-139.178.68.195:46002 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:23.605000 audit[4811]: USER_ACCT pid=4811 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.606392 sshd[4811]: Accepted publickey for core from 139.178.68.195 port 46002 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:23.608396 sshd[4811]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:23.610800 systemd-logind[1463]: New session 43 of user core. Feb 13 08:38:23.611243 systemd[1]: Started session-43.scope. Feb 13 08:38:23.689698 sshd[4811]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:23.691153 systemd[1]: sshd@57-145.40.67.89:22-139.178.68.195:46002.service: Deactivated successfully. Feb 13 08:38:23.691578 systemd[1]: session-43.scope: Deactivated successfully. 
Feb 13 08:38:23.691873 systemd-logind[1463]: Session 43 logged out. Waiting for processes to exit. Feb 13 08:38:23.692392 systemd-logind[1463]: Removed session 43. Feb 13 08:38:23.607000 audit[4811]: CRED_ACQ pid=4811 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.788187 kernel: audit: type=1101 audit(1707813503.605:1575): pid=4811 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.788220 kernel: audit: type=1103 audit(1707813503.607:1576): pid=4811 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.788239 kernel: audit: type=1006 audit(1707813503.607:1577): pid=4811 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1 Feb 13 08:38:23.846771 kernel: audit: type=1300 audit(1707813503.607:1577): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe09a34cd0 a2=3 a3=0 items=0 ppid=1 pid=4811 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:23.607000 audit[4811]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe09a34cd0 a2=3 a3=0 items=0 ppid=1 pid=4811 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:23.938746 kernel: audit: type=1327 
audit(1707813503.607:1577): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:23.607000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:23.969232 kernel: audit: type=1105 audit(1707813503.612:1578): pid=4811 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.612000 audit[4811]: USER_START pid=4811 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:24.064318 kernel: audit: type=1103 audit(1707813503.612:1579): pid=4815 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.612000 audit[4815]: CRED_ACQ pid=4815 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:24.153797 kernel: audit: type=1106 audit(1707813503.689:1580): pid=4811 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.689000 audit[4811]: USER_END pid=4811 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:24.249305 kernel: audit: type=1104 audit(1707813503.689:1581): pid=4811 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.689000 audit[4811]: CRED_DISP pid=4811 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:23.690000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-145.40.67.89:22-139.178.68.195:46002 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:25.264144 kubelet[2614]: E0213 08:38:25.264094 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:25.423306 kubelet[2614]: E0213 08:38:25.423236 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:27.264831 kubelet[2614]: E0213 08:38:27.264786 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:28.699479 systemd[1]: Started sshd@58-145.40.67.89:22-139.178.68.195:60440.service. Feb 13 08:38:28.698000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.67.89:22-139.178.68.195:60440 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:28.726617 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:38:28.726683 kernel: audit: type=1130 audit(1707813508.698:1583): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.67.89:22-139.178.68.195:60440 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:28.844000 audit[4839]: USER_ACCT pid=4839 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.845663 sshd[4839]: Accepted publickey for core from 139.178.68.195 port 60440 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:28.848659 sshd[4839]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:28.854337 systemd-logind[1463]: New session 44 of user core. Feb 13 08:38:28.856333 systemd[1]: Started session-44.scope. Feb 13 08:38:28.935218 sshd[4839]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:28.936678 systemd[1]: sshd@58-145.40.67.89:22-139.178.68.195:60440.service: Deactivated successfully. Feb 13 08:38:28.847000 audit[4839]: CRED_ACQ pid=4839 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.937139 systemd[1]: session-44.scope: Deactivated successfully. Feb 13 08:38:28.937515 systemd-logind[1463]: Session 44 logged out. Waiting for processes to exit. Feb 13 08:38:28.937910 systemd-logind[1463]: Removed session 44. 
Feb 13 08:38:29.027399 kernel: audit: type=1101 audit(1707813508.844:1584): pid=4839 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:29.027436 kernel: audit: type=1103 audit(1707813508.847:1585): pid=4839 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:29.027453 kernel: audit: type=1006 audit(1707813508.847:1586): pid=4839 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1 Feb 13 08:38:28.847000 audit[4839]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5628a530 a2=3 a3=0 items=0 ppid=1 pid=4839 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:29.178073 kernel: audit: type=1300 audit(1707813508.847:1586): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5628a530 a2=3 a3=0 items=0 ppid=1 pid=4839 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:29.178106 kernel: audit: type=1327 audit(1707813508.847:1586): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:28.847000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:29.208601 kernel: audit: type=1105 audit(1707813508.858:1587): pid=4839 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.858000 audit[4839]: USER_START pid=4839 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:29.264255 kubelet[2614]: E0213 08:38:29.264215 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:29.303161 kernel: audit: type=1103 audit(1707813508.859:1588): pid=4841 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.859000 audit[4841]: CRED_ACQ pid=4841 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.935000 audit[4839]: USER_END pid=4839 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:29.488012 kernel: audit: type=1106 audit(1707813508.935:1589): pid=4839 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:29.488050 kernel: audit: type=1104 audit(1707813508.935:1590): pid=4839 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.935000 audit[4839]: CRED_DISP pid=4839 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:28.935000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-145.40.67.89:22-139.178.68.195:60440 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:30.424837 kubelet[2614]: E0213 08:38:30.424731 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:31.265053 kubelet[2614]: E0213 08:38:31.265001 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:33.264892 kubelet[2614]: E0213 08:38:33.264875 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:33.945072 systemd[1]: Started sshd@59-145.40.67.89:22-139.178.68.195:60456.service. Feb 13 08:38:33.944000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.67.89:22-139.178.68.195:60456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:33.971953 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:38:33.972026 kernel: audit: type=1130 audit(1707813513.944:1592): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.67.89:22-139.178.68.195:60456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:34.091000 audit[4864]: USER_ACCT pid=4864 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.092271 sshd[4864]: Accepted publickey for core from 139.178.68.195 port 60456 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:34.093261 sshd[4864]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:34.095756 systemd-logind[1463]: New session 45 of user core. Feb 13 08:38:34.096499 systemd[1]: Started session-45.scope. Feb 13 08:38:34.176068 sshd[4864]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:34.178124 systemd[1]: sshd@59-145.40.67.89:22-139.178.68.195:60456.service: Deactivated successfully. Feb 13 08:38:34.178530 systemd[1]: session-45.scope: Deactivated successfully. Feb 13 08:38:34.178890 systemd-logind[1463]: Session 45 logged out. Waiting for processes to exit. Feb 13 08:38:34.179492 systemd[1]: Started sshd@60-145.40.67.89:22-139.178.68.195:60468.service. Feb 13 08:38:34.179964 systemd-logind[1463]: Removed session 45. 
Feb 13 08:38:34.092000 audit[4864]: CRED_ACQ pid=4864 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.274209 kernel: audit: type=1101 audit(1707813514.091:1593): pid=4864 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.274250 kernel: audit: type=1103 audit(1707813514.092:1594): pid=4864 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.274264 kernel: audit: type=1006 audit(1707813514.092:1595): pid=4864 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=45 res=1 Feb 13 08:38:34.332797 kernel: audit: type=1300 audit(1707813514.092:1595): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6ebf1c80 a2=3 a3=0 items=0 ppid=1 pid=4864 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:34.092000 audit[4864]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6ebf1c80 a2=3 a3=0 items=0 ppid=1 pid=4864 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:34.361250 sshd[4889]: Accepted publickey for core from 139.178.68.195 port 60468 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:34.363552 sshd[4889]: pam_unix(sshd:session): 
session opened for user core(uid=500) by (uid=0) Feb 13 08:38:34.365932 systemd-logind[1463]: New session 46 of user core. Feb 13 08:38:34.366667 systemd[1]: Started session-46.scope. Feb 13 08:38:34.092000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:34.424969 kernel: audit: type=1327 audit(1707813514.092:1595): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:34.097000 audit[4864]: USER_START pid=4864 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.550071 kernel: audit: type=1105 audit(1707813514.097:1596): pid=4864 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.550148 kernel: audit: type=1103 audit(1707813514.098:1597): pid=4866 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.098000 audit[4866]: CRED_ACQ pid=4866 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.175000 audit[4864]: USER_END pid=4864 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 
terminal=ssh res=success' Feb 13 08:38:34.735001 kernel: audit: type=1106 audit(1707813514.175:1598): pid=4864 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.735056 kernel: audit: type=1104 audit(1707813514.175:1599): pid=4864 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.175000 audit[4864]: CRED_DISP pid=4864 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.177000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-145.40.67.89:22-139.178.68.195:60456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:34.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.67.89:22-139.178.68.195:60468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:34.360000 audit[4889]: USER_ACCT pid=4889 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.362000 audit[4889]: CRED_ACQ pid=4889 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.362000 audit[4889]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeaa8b6bf0 a2=3 a3=0 items=0 ppid=1 pid=4889 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:34.362000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:34.367000 audit[4889]: USER_START pid=4889 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.368000 audit[4891]: CRED_ACQ pid=4891 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.918388 sshd[4889]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:34.918000 audit[4889]: USER_END pid=4889 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.918000 audit[4889]: CRED_DISP pid=4889 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.920145 systemd[1]: sshd@60-145.40.67.89:22-139.178.68.195:60468.service: Deactivated successfully. Feb 13 08:38:34.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-145.40.67.89:22-139.178.68.195:60468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:34.920553 systemd[1]: session-46.scope: Deactivated successfully. Feb 13 08:38:34.920878 systemd-logind[1463]: Session 46 logged out. Waiting for processes to exit. Feb 13 08:38:34.921543 systemd[1]: Started sshd@61-145.40.67.89:22-139.178.68.195:60470.service. Feb 13 08:38:34.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.67.89:22-139.178.68.195:60470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:34.922072 systemd-logind[1463]: Removed session 46. 
Feb 13 08:38:34.955000 audit[4913]: USER_ACCT pid=4913 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.956786 sshd[4913]: Accepted publickey for core from 139.178.68.195 port 60470 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:34.958000 audit[4913]: CRED_ACQ pid=4913 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.958000 audit[4913]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcbe0f630 a2=3 a3=0 items=0 ppid=1 pid=4913 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:34.958000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:34.959978 sshd[4913]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:34.970043 systemd-logind[1463]: New session 47 of user core. Feb 13 08:38:34.972575 systemd[1]: Started session-47.scope. 
Feb 13 08:38:34.986000 audit[4913]: USER_START pid=4913 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:34.989000 audit[4915]: CRED_ACQ pid=4915 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:35.154629 sshd[4913]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:35.155000 audit[4913]: USER_END pid=4913 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:35.155000 audit[4913]: CRED_DISP pid=4913 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:35.158299 systemd[1]: sshd@61-145.40.67.89:22-139.178.68.195:60470.service: Deactivated successfully. Feb 13 08:38:35.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-145.40.67.89:22-139.178.68.195:60470 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:35.159457 systemd[1]: session-47.scope: Deactivated successfully. Feb 13 08:38:35.160507 systemd-logind[1463]: Session 47 logged out. Waiting for processes to exit. Feb 13 08:38:35.161851 systemd-logind[1463]: Removed session 47. 
Feb 13 08:38:35.265304 kubelet[2614]: E0213 08:38:35.265189 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:35.426278 kubelet[2614]: E0213 08:38:35.426221 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:35.609000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.609000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0002fee00 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:38:35.609000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:38:35.609000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.609000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002448d50 a2=fc6 a3=0 items=0 
ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:38:35.609000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:38:35.892000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.892000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.892000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c01166b560 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:38:35.892000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c01197b0e0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:38:35.892000 audit: PROCTITLE 
proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:38:35.892000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:38:35.892000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.892000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0079c5cb0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:38:35.892000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:38:35.892000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.892000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0002badc0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:38:35.892000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:38:35.892000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.892000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00539c5d0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:38:35.892000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:38:35.892000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:35.892000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c00380e620 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:38:35.892000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:38:37.264245 kubelet[2614]: E0213 08:38:37.264198 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:39.265258 kubelet[2614]: E0213 08:38:39.265216 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:40.164129 systemd[1]: Started sshd@62-145.40.67.89:22-139.178.68.195:40322.service. Feb 13 08:38:40.163000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.67.89:22-139.178.68.195:40322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:40.191283 kernel: kauditd_printk_skb: 47 callbacks suppressed Feb 13 08:38:40.191351 kernel: audit: type=1130 audit(1707813520.163:1627): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.67.89:22-139.178.68.195:40322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:40.310000 audit[4939]: USER_ACCT pid=4939 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.311425 sshd[4939]: Accepted publickey for core from 139.178.68.195 port 40322 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:38:40.313146 sshd[4939]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:38:40.315914 systemd-logind[1463]: New session 48 of user core. Feb 13 08:38:40.316972 systemd[1]: Started session-48.scope. Feb 13 08:38:40.395707 sshd[4939]: pam_unix(sshd:session): session closed for user core Feb 13 08:38:40.397591 systemd[1]: sshd@62-145.40.67.89:22-139.178.68.195:40322.service: Deactivated successfully. Feb 13 08:38:40.398021 systemd[1]: session-48.scope: Deactivated successfully. Feb 13 08:38:40.398386 systemd-logind[1463]: Session 48 logged out. Waiting for processes to exit. Feb 13 08:38:40.398860 systemd-logind[1463]: Removed session 48. 
Feb 13 08:38:40.312000 audit[4939]: CRED_ACQ pid=4939 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.426799 kubelet[2614]: E0213 08:38:40.426759 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:38:40.493317 kernel: audit: type=1101 audit(1707813520.310:1628): pid=4939 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.493344 kernel: audit: type=1103 audit(1707813520.312:1629): pid=4939 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.493357 kernel: audit: type=1006 audit(1707813520.312:1630): pid=4939 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1 Feb 13 08:38:40.552063 kernel: audit: type=1300 audit(1707813520.312:1630): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8eba1710 a2=3 a3=0 items=0 ppid=1 pid=4939 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:40.312000 audit[4939]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8eba1710 a2=3 a3=0 items=0 ppid=1 pid=4939 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd" exe="/usr/sbin/sshd" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:38:40.644118 kernel: audit: type=1327 audit(1707813520.312:1630): proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:40.312000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:38:40.674666 kernel: audit: type=1105 audit(1707813520.320:1631): pid=4939 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.320000 audit[4939]: USER_START pid=4939 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.769158 kernel: audit: type=1103 audit(1707813520.320:1632): pid=4943 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.320000 audit[4943]: CRED_ACQ pid=4943 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.858455 kernel: audit: type=1106 audit(1707813520.395:1633): pid=4939 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.395000 audit[4939]: USER_END pid=4939 uid=0 auid=500 ses=48 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.954015 kernel: audit: type=1104 audit(1707813520.395:1634): pid=4939 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.395000 audit[4939]: CRED_DISP pid=4939 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:40.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-145.40.67.89:22-139.178.68.195:40322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:38:40.949000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:40.949000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000899d40 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:38:40.949000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:38:40.949000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:40.949000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000bfd7e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:38:40.949000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:38:40.950000 audit[2442]: AVC avc: denied { watch } for pid=2442 
comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:40.950000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002355c80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:38:40.950000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:38:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:38:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000899f60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:38:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:38:41.264908 kubelet[2614]: E0213 08:38:41.264859 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:42.320688 kubelet[2614]: E0213 08:38:42.320586 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.320688 kubelet[2614]: W0213 08:38:42.320629 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.320688 kubelet[2614]: E0213 08:38:42.320672 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:42.321684 kubelet[2614]: E0213 08:38:42.321149 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.321684 kubelet[2614]: W0213 08:38:42.321180 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.321684 kubelet[2614]: E0213 08:38:42.321215 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:42.321983 kubelet[2614]: E0213 08:38:42.321688 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.321983 kubelet[2614]: W0213 08:38:42.321719 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.321983 kubelet[2614]: E0213 08:38:42.321755 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:42.322408 kubelet[2614]: E0213 08:38:42.322332 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.322408 kubelet[2614]: W0213 08:38:42.322364 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.322408 kubelet[2614]: E0213 08:38:42.322400 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:42.322972 kubelet[2614]: E0213 08:38:42.322916 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.322972 kubelet[2614]: W0213 08:38:42.322969 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.323212 kubelet[2614]: E0213 08:38:42.323008 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:42.323590 kubelet[2614]: E0213 08:38:42.323516 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.323590 kubelet[2614]: W0213 08:38:42.323548 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.323590 kubelet[2614]: E0213 08:38:42.323585 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:38:42.324133 kubelet[2614]: E0213 08:38:42.324060 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.324133 kubelet[2614]: W0213 08:38:42.324085 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.324133 kubelet[2614]: E0213 08:38:42.324122 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:38:42.324647 kubelet[2614]: E0213 08:38:42.324590 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:38:42.324647 kubelet[2614]: W0213 08:38:42.324622 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:38:42.324873 kubelet[2614]: E0213 08:38:42.324661 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Feb 13 08:38:42.325157 kubelet[2614]: E0213 08:38:42.325075 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.325157 kubelet[2614]: W0213 08:38:42.325104 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.325157 kubelet[2614]: E0213 08:38:42.325136 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.325714 kubelet[2614]: E0213 08:38:42.325683 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.325827 kubelet[2614]: W0213 08:38:42.325715 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.325827 kubelet[2614]: E0213 08:38:42.325751 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.326322 kubelet[2614]: E0213 08:38:42.326247 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.326322 kubelet[2614]: W0213 08:38:42.326278 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.326322 kubelet[2614]: E0213 08:38:42.326315 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.326761 kubelet[2614]: E0213 08:38:42.326735 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.326761 kubelet[2614]: W0213 08:38:42.326760 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.326993 kubelet[2614]: E0213 08:38:42.326793 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.327422 kubelet[2614]: E0213 08:38:42.327346 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.327422 kubelet[2614]: W0213 08:38:42.327378 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.327422 kubelet[2614]: E0213 08:38:42.327414 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.327873 kubelet[2614]: E0213 08:38:42.327847 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.328015 kubelet[2614]: W0213 08:38:42.327872 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.328015 kubelet[2614]: E0213 08:38:42.327905 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.328444 kubelet[2614]: E0213 08:38:42.328408 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.328591 kubelet[2614]: W0213 08:38:42.328444 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.328591 kubelet[2614]: E0213 08:38:42.328482 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.328905 kubelet[2614]: E0213 08:38:42.328878 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.328905 kubelet[2614]: W0213 08:38:42.328902 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.329159 kubelet[2614]: E0213 08:38:42.328959 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.329447 kubelet[2614]: E0213 08:38:42.329417 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.329548 kubelet[2614]: W0213 08:38:42.329449 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.329548 kubelet[2614]: E0213 08:38:42.329491 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.329948 kubelet[2614]: E0213 08:38:42.329910 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.330077 kubelet[2614]: W0213 08:38:42.329960 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.330077 kubelet[2614]: E0213 08:38:42.330001 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.330420 kubelet[2614]: E0213 08:38:42.330396 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.330519 kubelet[2614]: W0213 08:38:42.330419 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.330519 kubelet[2614]: E0213 08:38:42.330448 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:42.330848 kubelet[2614]: E0213 08:38:42.330824 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:42.330992 kubelet[2614]: W0213 08:38:42.330848 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:42.330992 kubelet[2614]: E0213 08:38:42.330877 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Feb 13 08:38:43.264378 kubelet[2614]: E0213 08:38:43.264330 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:45.265231 kubelet[2614]: E0213 08:38:45.265185 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:45.405177 systemd[1]: Started sshd@63-145.40.67.89:22-139.178.68.195:40328.service.
Feb 13 08:38:45.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.67.89:22-139.178.68.195:40328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:45.427997 kubelet[2614]: E0213 08:38:45.427968 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:38:45.432348 kernel: kauditd_printk_skb: 13 callbacks suppressed
Feb 13 08:38:45.432411 kernel: audit: type=1130 audit(1707813525.404:1640): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.67.89:22-139.178.68.195:40328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:45.549000 audit[4985]: USER_ACCT pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.550922 sshd[4985]: Accepted publickey for core from 139.178.68.195 port 40328 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:38:45.552234 sshd[4985]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:38:45.554462 systemd-logind[1463]: New session 49 of user core.
Feb 13 08:38:45.554951 systemd[1]: Started session-49.scope.
Feb 13 08:38:45.631329 sshd[4985]: pam_unix(sshd:session): session closed for user core
Feb 13 08:38:45.632622 systemd[1]: sshd@63-145.40.67.89:22-139.178.68.195:40328.service: Deactivated successfully.
Feb 13 08:38:45.633077 systemd[1]: session-49.scope: Deactivated successfully.
Feb 13 08:38:45.633414 systemd-logind[1463]: Session 49 logged out. Waiting for processes to exit.
Feb 13 08:38:45.633785 systemd-logind[1463]: Removed session 49.
Feb 13 08:38:45.551000 audit[4985]: CRED_ACQ pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.732912 kernel: audit: type=1101 audit(1707813525.549:1641): pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.732948 kernel: audit: type=1103 audit(1707813525.551:1642): pid=4985 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.732965 kernel: audit: type=1006 audit(1707813525.551:1643): pid=4985 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1
Feb 13 08:38:45.791549 kernel: audit: type=1300 audit(1707813525.551:1643): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdc15a6860 a2=3 a3=0 items=0 ppid=1 pid=4985 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:38:45.551000 audit[4985]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdc15a6860 a2=3 a3=0 items=0 ppid=1 pid=4985 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:38:45.551000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:38:45.914217 kernel: audit: type=1327 audit(1707813525.551:1643): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:38:45.914244 kernel: audit: type=1105 audit(1707813525.556:1644): pid=4985 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.556000 audit[4985]: USER_START pid=4985 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:46.008809 kernel: audit: type=1103 audit(1707813525.556:1645): pid=4987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.556000 audit[4987]: CRED_ACQ pid=4987 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:46.098116 kernel: audit: type=1106 audit(1707813525.631:1646): pid=4985 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.631000 audit[4985]: USER_END pid=4985 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:46.193798 kernel: audit: type=1104 audit(1707813525.631:1647): pid=4985 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.631000 audit[4985]: CRED_DISP pid=4985 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:45.631000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-145.40.67.89:22-139.178.68.195:40328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:47.264599 kubelet[2614]: E0213 08:38:47.264554 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:49.265084 kubelet[2614]: E0213 08:38:49.265037 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:50.429205 kubelet[2614]: E0213 08:38:50.429110 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:38:50.641237 systemd[1]: Started sshd@64-145.40.67.89:22-139.178.68.195:38110.service.
Feb 13 08:38:50.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.67.89:22-139.178.68.195:38110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:50.668285 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:38:50.668325 kernel: audit: type=1130 audit(1707813530.640:1649): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.67.89:22-139.178.68.195:38110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:50.786000 audit[5009]: USER_ACCT pid=5009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.787364 sshd[5009]: Accepted publickey for core from 139.178.68.195 port 38110 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:38:50.789234 sshd[5009]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:38:50.791537 systemd-logind[1463]: New session 50 of user core.
Feb 13 08:38:50.792219 systemd[1]: Started session-50.scope.
Feb 13 08:38:50.870641 sshd[5009]: pam_unix(sshd:session): session closed for user core
Feb 13 08:38:50.872119 systemd[1]: sshd@64-145.40.67.89:22-139.178.68.195:38110.service: Deactivated successfully.
Feb 13 08:38:50.872544 systemd[1]: session-50.scope: Deactivated successfully.
Feb 13 08:38:50.872863 systemd-logind[1463]: Session 50 logged out. Waiting for processes to exit.
Feb 13 08:38:50.873453 systemd-logind[1463]: Removed session 50.
Feb 13 08:38:50.788000 audit[5009]: CRED_ACQ pid=5009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.969274 kernel: audit: type=1101 audit(1707813530.786:1650): pid=5009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.969308 kernel: audit: type=1103 audit(1707813530.788:1651): pid=5009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.969326 kernel: audit: type=1006 audit(1707813530.788:1652): pid=5009 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1
Feb 13 08:38:51.027880 kernel: audit: type=1300 audit(1707813530.788:1652): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc575a3f50 a2=3 a3=0 items=0 ppid=1 pid=5009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:38:50.788000 audit[5009]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc575a3f50 a2=3 a3=0 items=0 ppid=1 pid=5009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:38:51.119907 kernel: audit: type=1327 audit(1707813530.788:1652): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:38:50.788000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:38:51.150433 kernel: audit: type=1105 audit(1707813530.793:1653): pid=5009 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.793000 audit[5009]: USER_START pid=5009 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:51.244959 kernel: audit: type=1103 audit(1707813530.794:1654): pid=5011 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.794000 audit[5011]: CRED_ACQ pid=5011 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:51.264685 kubelet[2614]: E0213 08:38:51.264646 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:51.334260 kernel: audit: type=1106 audit(1707813530.870:1655): pid=5009 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.870000 audit[5009]: USER_END pid=5009 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.870000 audit[5009]: CRED_DISP pid=5009 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:51.429930 kernel: audit: type=1104 audit(1707813530.870:1656): pid=5009 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:50.871000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-145.40.67.89:22-139.178.68.195:38110 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:53.264581 kubelet[2614]: E0213 08:38:53.264536 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:55.265323 kubelet[2614]: E0213 08:38:55.265275 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:38:55.329068 kubelet[2614]: E0213 08:38:55.328956 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.329068 kubelet[2614]: W0213 08:38:55.329000 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.329068 kubelet[2614]: E0213 08:38:55.329047 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Feb 13 08:38:55.329624 kubelet[2614]: E0213 08:38:55.329591 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.329755 kubelet[2614]: W0213 08:38:55.329626 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.329755 kubelet[2614]: E0213 08:38:55.329666 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.330320 kubelet[2614]: E0213 08:38:55.330232 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.330320 kubelet[2614]: W0213 08:38:55.330265 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.330320 kubelet[2614]: E0213 08:38:55.330303 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.330915 kubelet[2614]: E0213 08:38:55.330878 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.330915 kubelet[2614]: W0213 08:38:55.330913 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.331200 kubelet[2614]: E0213 08:38:55.330968 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.331573 kubelet[2614]: E0213 08:38:55.331484 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.331573 kubelet[2614]: W0213 08:38:55.331518 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.331573 kubelet[2614]: E0213 08:38:55.331557 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.332146 kubelet[2614]: E0213 08:38:55.332061 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.332146 kubelet[2614]: W0213 08:38:55.332087 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.332146 kubelet[2614]: E0213 08:38:55.332121 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.332757 kubelet[2614]: E0213 08:38:55.332687 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.332757 kubelet[2614]: W0213 08:38:55.332721 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.333029 kubelet[2614]: E0213 08:38:55.332763 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.333405 kubelet[2614]: E0213 08:38:55.333315 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.333405 kubelet[2614]: W0213 08:38:55.333349 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.333405 kubelet[2614]: E0213 08:38:55.333388 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.333906 kubelet[2614]: E0213 08:38:55.333878 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.333906 kubelet[2614]: W0213 08:38:55.333905 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.334170 kubelet[2614]: E0213 08:38:55.333963 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.334565 kubelet[2614]: E0213 08:38:55.334477 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.334565 kubelet[2614]: W0213 08:38:55.334510 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.334565 kubelet[2614]: E0213 08:38:55.334549 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.335055 kubelet[2614]: E0213 08:38:55.335012 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.335055 kubelet[2614]: W0213 08:38:55.335037 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.335316 kubelet[2614]: E0213 08:38:55.335070 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.335591 kubelet[2614]: E0213 08:38:55.335524 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 08:38:55.335591 kubelet[2614]: W0213 08:38:55.335549 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 08:38:55.335591 kubelet[2614]: E0213 08:38:55.335582 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 08:38:55.431095 kubelet[2614]: E0213 08:38:55.430989 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized"
Feb 13 08:38:55.879313 systemd[1]: Started sshd@65-145.40.67.89:22-139.178.68.195:38120.service.
Feb 13 08:38:55.878000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.67.89:22-139.178.68.195:38120 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:38:55.906016 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:38:55.906068 kernel: audit: type=1130 audit(1707813535.878:1658): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.67.89:22-139.178.68.195:38120 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success'
Feb 13 08:38:56.023000 audit[5048]: USER_ACCT pid=5048 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.024363 sshd[5048]: Accepted publickey for core from 139.178.68.195 port 38120 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:38:56.025972 sshd[5048]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:38:56.028282 systemd-logind[1463]: New session 51 of user core.
Feb 13 08:38:56.028729 systemd[1]: Started session-51.scope.
Feb 13 08:38:56.107983 sshd[5048]: pam_unix(sshd:session): session closed for user core
Feb 13 08:38:56.109448 systemd[1]: sshd@65-145.40.67.89:22-139.178.68.195:38120.service: Deactivated successfully.
Feb 13 08:38:56.109873 systemd[1]: session-51.scope: Deactivated successfully.
Feb 13 08:38:56.110264 systemd-logind[1463]: Session 51 logged out. Waiting for processes to exit.
Feb 13 08:38:56.110749 systemd-logind[1463]: Removed session 51.
Feb 13 08:38:56.024000 audit[5048]: CRED_ACQ pid=5048 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.207710 kernel: audit: type=1101 audit(1707813536.023:1659): pid=5048 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.207745 kernel: audit: type=1103 audit(1707813536.024:1660): pid=5048 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.207761 kernel: audit: type=1006 audit(1707813536.024:1661): pid=5048 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1
Feb 13 08:38:56.024000 audit[5048]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6d0d75f0 a2=3 a3=0 items=0 ppid=1 pid=5048 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:38:56.358436 kernel: audit: type=1300 audit(1707813536.024:1661): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6d0d75f0 a2=3 a3=0 items=0 ppid=1 pid=5048 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:38:56.358461 kernel: audit: type=1327 audit(1707813536.024:1661): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:38:56.024000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:38:56.388977 kernel: audit: type=1105 audit(1707813536.029:1662): pid=5048 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.029000 audit[5048]: USER_START pid=5048 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.483478 kernel: audit: type=1103 audit(1707813536.030:1663): pid=5050 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.030000 audit[5050]: CRED_ACQ pid=5050 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.107000 audit[5048]: USER_END pid=5048 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.668425 kernel: audit: type=1106 audit(1707813536.107:1664): pid=5048 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:38:56.668455 kernel: audit: type=1104 audit(1707813536.107:1665): pid=5048 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:56.107000 audit[5048]: CRED_DISP pid=5048 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:38:56.108000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-145.40.67.89:22-139.178.68.195:38120 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:38:57.264458 kubelet[2614]: E0213 08:38:57.264414 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:38:59.264273 kubelet[2614]: E0213 08:38:59.264227 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:00.432769 kubelet[2614]: E0213 08:39:00.432666 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:01.117709 systemd[1]: Started sshd@66-145.40.67.89:22-139.178.68.195:34186.service. 
Feb 13 08:39:01.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.67.89:22-139.178.68.195:34186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:01.144639 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:39:01.144666 kernel: audit: type=1130 audit(1707813541.116:1667): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.67.89:22-139.178.68.195:34186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:01.261000 audit[5072]: USER_ACCT pid=5072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.262422 sshd[5072]: Accepted publickey for core from 139.178.68.195 port 34186 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:01.264258 sshd[5072]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:01.264715 kubelet[2614]: E0213 08:39:01.264697 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:01.266587 systemd-logind[1463]: New session 52 of user core. Feb 13 08:39:01.267047 systemd[1]: Started session-52.scope. Feb 13 08:39:01.343715 sshd[5072]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:01.345148 systemd[1]: sshd@66-145.40.67.89:22-139.178.68.195:34186.service: Deactivated successfully. 
Feb 13 08:39:01.345574 systemd[1]: session-52.scope: Deactivated successfully. Feb 13 08:39:01.345854 systemd-logind[1463]: Session 52 logged out. Waiting for processes to exit. Feb 13 08:39:01.346381 systemd-logind[1463]: Removed session 52. Feb 13 08:39:01.263000 audit[5072]: CRED_ACQ pid=5072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.444262 kernel: audit: type=1101 audit(1707813541.261:1668): pid=5072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.444303 kernel: audit: type=1103 audit(1707813541.263:1669): pid=5072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.444321 kernel: audit: type=1006 audit(1707813541.263:1670): pid=5072 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Feb 13 08:39:01.502931 kernel: audit: type=1300 audit(1707813541.263:1670): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd6232ab0 a2=3 a3=0 items=0 ppid=1 pid=5072 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:01.263000 audit[5072]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd6232ab0 a2=3 a3=0 items=0 ppid=1 pid=5072 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd" exe="/usr/sbin/sshd" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:01.595017 kernel: audit: type=1327 audit(1707813541.263:1670): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:01.263000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:01.268000 audit[5072]: USER_START pid=5072 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.719972 kernel: audit: type=1105 audit(1707813541.268:1671): pid=5072 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.720063 kernel: audit: type=1103 audit(1707813541.268:1672): pid=5075 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.268000 audit[5075]: CRED_ACQ pid=5075 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.809288 kernel: audit: type=1106 audit(1707813541.343:1673): pid=5072 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.343000 audit[5072]: USER_END pid=5072 uid=0 auid=500 ses=52 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.904858 kernel: audit: type=1104 audit(1707813541.343:1674): pid=5072 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.343000 audit[5072]: CRED_DISP pid=5072 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:01.344000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-145.40.67.89:22-139.178.68.195:34186 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:02.288924 kubelet[2614]: E0213 08:39:02.288852 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.288924 kubelet[2614]: W0213 08:39:02.288896 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.289949 kubelet[2614]: E0213 08:39:02.288971 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.289949 kubelet[2614]: E0213 08:39:02.289433 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.289949 kubelet[2614]: W0213 08:39:02.289464 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.289949 kubelet[2614]: E0213 08:39:02.289500 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:02.290393 kubelet[2614]: E0213 08:39:02.290017 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.290393 kubelet[2614]: W0213 08:39:02.290050 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.290393 kubelet[2614]: E0213 08:39:02.290087 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.290693 kubelet[2614]: E0213 08:39:02.290647 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.290693 kubelet[2614]: W0213 08:39:02.290684 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.290937 kubelet[2614]: E0213 08:39:02.290720 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:02.291252 kubelet[2614]: E0213 08:39:02.291205 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.291252 kubelet[2614]: W0213 08:39:02.291238 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.291584 kubelet[2614]: E0213 08:39:02.291275 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.291799 kubelet[2614]: E0213 08:39:02.291735 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.291799 kubelet[2614]: W0213 08:39:02.291768 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.291799 kubelet[2614]: E0213 08:39:02.291805 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:02.292380 kubelet[2614]: E0213 08:39:02.292338 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.292380 kubelet[2614]: W0213 08:39:02.292370 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.292717 kubelet[2614]: E0213 08:39:02.292407 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.292878 kubelet[2614]: E0213 08:39:02.292847 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.293028 kubelet[2614]: W0213 08:39:02.292882 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.293028 kubelet[2614]: E0213 08:39:02.292939 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:02.332807 kubelet[2614]: E0213 08:39:02.332744 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.332807 kubelet[2614]: W0213 08:39:02.332789 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.333369 kubelet[2614]: E0213 08:39:02.332854 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.333616 kubelet[2614]: E0213 08:39:02.333578 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.333616 kubelet[2614]: W0213 08:39:02.333609 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.333973 kubelet[2614]: E0213 08:39:02.333660 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:02.334232 kubelet[2614]: E0213 08:39:02.334194 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.334232 kubelet[2614]: W0213 08:39:02.334224 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.334586 kubelet[2614]: E0213 08:39:02.334273 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.334796 kubelet[2614]: E0213 08:39:02.334758 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.334796 kubelet[2614]: W0213 08:39:02.334788 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.335184 kubelet[2614]: E0213 08:39:02.334835 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:02.335442 kubelet[2614]: E0213 08:39:02.335405 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.335442 kubelet[2614]: W0213 08:39:02.335435 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.335804 kubelet[2614]: E0213 08:39:02.335481 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:02.336530 kubelet[2614]: E0213 08:39:02.336484 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:02.336530 kubelet[2614]: W0213 08:39:02.336523 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:02.336836 kubelet[2614]: E0213 08:39:02.336576 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:03.265287 kubelet[2614]: E0213 08:39:03.265186 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:05.265300 kubelet[2614]: E0213 08:39:05.265141 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:05.434242 kubelet[2614]: E0213 08:39:05.434178 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:06.347414 systemd[1]: Started sshd@67-145.40.67.89:22-139.178.68.195:44438.service. 
Feb 13 08:39:06.346000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.67.89:22-139.178.68.195:44438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:06.374215 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:39:06.374273 kernel: audit: type=1130 audit(1707813546.346:1676): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.67.89:22-139.178.68.195:44438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:06.490000 audit[5112]: USER_ACCT pid=5112 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.491969 sshd[5112]: Accepted publickey for core from 139.178.68.195 port 44438 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:06.493241 sshd[5112]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:06.495612 systemd-logind[1463]: New session 53 of user core. Feb 13 08:39:06.496099 systemd[1]: Started session-53.scope. Feb 13 08:39:06.577330 sshd[5112]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:06.578778 systemd[1]: sshd@67-145.40.67.89:22-139.178.68.195:44438.service: Deactivated successfully. Feb 13 08:39:06.579232 systemd[1]: session-53.scope: Deactivated successfully. Feb 13 08:39:06.579597 systemd-logind[1463]: Session 53 logged out. Waiting for processes to exit. Feb 13 08:39:06.580012 systemd-logind[1463]: Removed session 53. 
Feb 13 08:39:06.492000 audit[5112]: CRED_ACQ pid=5112 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.676501 kernel: audit: type=1101 audit(1707813546.490:1677): pid=5112 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.676545 kernel: audit: type=1103 audit(1707813546.492:1678): pid=5112 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.676558 kernel: audit: type=1006 audit(1707813546.492:1679): pid=5112 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Feb 13 08:39:06.735135 kernel: audit: type=1300 audit(1707813546.492:1679): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffefa12f450 a2=3 a3=0 items=0 ppid=1 pid=5112 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:06.492000 audit[5112]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffefa12f450 a2=3 a3=0 items=0 ppid=1 pid=5112 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:06.827139 kernel: audit: type=1327 audit(1707813546.492:1679): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:06.492000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:39:06.857641 kernel: audit: type=1105 audit(1707813546.497:1680): pid=5112 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.497000 audit[5112]: USER_START pid=5112 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.952093 kernel: audit: type=1103 audit(1707813546.498:1681): pid=5114 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.498000 audit[5114]: CRED_ACQ pid=5114 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:07.041316 kernel: audit: type=1106 audit(1707813546.577:1682): pid=5112 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.577000 audit[5112]: USER_END pid=5112 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:39:07.136847 kernel: audit: type=1104 audit(1707813546.577:1683): pid=5112 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.577000 audit[5112]: CRED_DISP pid=5112 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:06.577000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-145.40.67.89:22-139.178.68.195:44438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:07.265123 kubelet[2614]: E0213 08:39:07.265109 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:09.264295 kubelet[2614]: E0213 08:39:09.264248 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:10.349314 kubelet[2614]: E0213 08:39:10.349203 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.349314 kubelet[2614]: W0213 08:39:10.349248 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Feb 13 08:39:10.349314 kubelet[2614]: E0213 08:39:10.349293 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.350744 kubelet[2614]: E0213 08:39:10.349793 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.350744 kubelet[2614]: W0213 08:39:10.349827 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.350744 kubelet[2614]: E0213 08:39:10.349868 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.350744 kubelet[2614]: E0213 08:39:10.350445 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.350744 kubelet[2614]: W0213 08:39:10.350478 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.350744 kubelet[2614]: E0213 08:39:10.350515 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.351518 kubelet[2614]: E0213 08:39:10.351140 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.351518 kubelet[2614]: W0213 08:39:10.351173 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.351518 kubelet[2614]: E0213 08:39:10.351211 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.351989 kubelet[2614]: E0213 08:39:10.351722 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.351989 kubelet[2614]: W0213 08:39:10.351754 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.351989 kubelet[2614]: E0213 08:39:10.351793 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.352363 kubelet[2614]: E0213 08:39:10.352311 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.352363 kubelet[2614]: W0213 08:39:10.352343 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.352584 kubelet[2614]: E0213 08:39:10.352381 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.352992 kubelet[2614]: E0213 08:39:10.352945 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.352992 kubelet[2614]: W0213 08:39:10.352979 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.353250 kubelet[2614]: E0213 08:39:10.353015 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.353585 kubelet[2614]: E0213 08:39:10.353496 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.353585 kubelet[2614]: W0213 08:39:10.353529 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.353585 kubelet[2614]: E0213 08:39:10.353572 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.354082 kubelet[2614]: E0213 08:39:10.354061 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.354214 kubelet[2614]: W0213 08:39:10.354087 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.354214 kubelet[2614]: E0213 08:39:10.354123 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.354801 kubelet[2614]: E0213 08:39:10.354710 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.354801 kubelet[2614]: W0213 08:39:10.354747 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.354801 kubelet[2614]: E0213 08:39:10.354788 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.355429 kubelet[2614]: E0213 08:39:10.355336 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.355429 kubelet[2614]: W0213 08:39:10.355370 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.355429 kubelet[2614]: E0213 08:39:10.355410 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.355915 kubelet[2614]: E0213 08:39:10.355885 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.355915 kubelet[2614]: W0213 08:39:10.355912 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.356212 kubelet[2614]: E0213 08:39:10.355969 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.356571 kubelet[2614]: E0213 08:39:10.356481 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.356571 kubelet[2614]: W0213 08:39:10.356515 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.356571 kubelet[2614]: E0213 08:39:10.356558 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.357100 kubelet[2614]: E0213 08:39:10.357050 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.357100 kubelet[2614]: W0213 08:39:10.357074 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.357349 kubelet[2614]: E0213 08:39:10.357107 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.357681 kubelet[2614]: E0213 08:39:10.357609 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.357681 kubelet[2614]: W0213 08:39:10.357643 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.357681 kubelet[2614]: E0213 08:39:10.357682 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.358300 kubelet[2614]: E0213 08:39:10.358209 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.358300 kubelet[2614]: W0213 08:39:10.358243 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.358300 kubelet[2614]: E0213 08:39:10.358282 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.389912 kubelet[2614]: E0213 08:39:10.389805 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.389912 kubelet[2614]: W0213 08:39:10.389847 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.389912 kubelet[2614]: E0213 08:39:10.389893 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.390588 kubelet[2614]: E0213 08:39:10.390497 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.390588 kubelet[2614]: W0213 08:39:10.390531 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.390588 kubelet[2614]: E0213 08:39:10.390587 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.391296 kubelet[2614]: E0213 08:39:10.391204 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.391296 kubelet[2614]: W0213 08:39:10.391240 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.391296 kubelet[2614]: E0213 08:39:10.391295 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.391893 kubelet[2614]: E0213 08:39:10.391858 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.391893 kubelet[2614]: W0213 08:39:10.391893 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.392199 kubelet[2614]: E0213 08:39:10.391968 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.392609 kubelet[2614]: E0213 08:39:10.392519 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.392609 kubelet[2614]: W0213 08:39:10.392554 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.392966 kubelet[2614]: E0213 08:39:10.392696 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.393107 kubelet[2614]: E0213 08:39:10.393086 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.393240 kubelet[2614]: W0213 08:39:10.393112 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.393240 kubelet[2614]: E0213 08:39:10.393155 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.393748 kubelet[2614]: E0213 08:39:10.393656 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.393748 kubelet[2614]: W0213 08:39:10.393691 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.393748 kubelet[2614]: E0213 08:39:10.393737 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.394376 kubelet[2614]: E0213 08:39:10.394284 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.394376 kubelet[2614]: W0213 08:39:10.394319 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.394719 kubelet[2614]: E0213 08:39:10.394447 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.395028 kubelet[2614]: E0213 08:39:10.394924 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.395028 kubelet[2614]: W0213 08:39:10.394987 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.395387 kubelet[2614]: E0213 08:39:10.395043 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.395773 kubelet[2614]: E0213 08:39:10.395702 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.395773 kubelet[2614]: W0213 08:39:10.395737 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.396085 kubelet[2614]: E0213 08:39:10.395782 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.396530 kubelet[2614]: E0213 08:39:10.396438 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.396530 kubelet[2614]: W0213 08:39:10.396474 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.396530 kubelet[2614]: E0213 08:39:10.396522 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:10.397123 kubelet[2614]: E0213 08:39:10.397035 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:10.397123 kubelet[2614]: W0213 08:39:10.397063 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:10.397123 kubelet[2614]: E0213 08:39:10.397099 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:10.436117 kubelet[2614]: E0213 08:39:10.436051 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:11.264817 kubelet[2614]: E0213 08:39:11.264799 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:11.587752 systemd[1]: Started sshd@68-145.40.67.89:22-139.178.68.195:44444.service. Feb 13 08:39:11.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.67.89:22-139.178.68.195:44444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:11.614991 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:39:11.615061 kernel: audit: type=1130 audit(1707813551.587:1685): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.67.89:22-139.178.68.195:44444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:11.732000 audit[5165]: USER_ACCT pid=5165 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.733415 sshd[5165]: Accepted publickey for core from 139.178.68.195 port 44444 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:11.734407 sshd[5165]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:11.736813 systemd-logind[1463]: New session 54 of user core. Feb 13 08:39:11.737327 systemd[1]: Started session-54.scope. Feb 13 08:39:11.815548 sshd[5165]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:11.816812 systemd[1]: sshd@68-145.40.67.89:22-139.178.68.195:44444.service: Deactivated successfully. Feb 13 08:39:11.817238 systemd[1]: session-54.scope: Deactivated successfully. Feb 13 08:39:11.817619 systemd-logind[1463]: Session 54 logged out. Waiting for processes to exit. Feb 13 08:39:11.818146 systemd-logind[1463]: Removed session 54. 
Feb 13 08:39:11.733000 audit[5165]: CRED_ACQ pid=5165 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.915236 kernel: audit: type=1101 audit(1707813551.732:1686): pid=5165 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.915275 kernel: audit: type=1103 audit(1707813551.733:1687): pid=5165 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.915295 kernel: audit: type=1006 audit(1707813551.733:1688): pid=5165 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Feb 13 08:39:11.973799 kernel: audit: type=1300 audit(1707813551.733:1688): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0a2219c0 a2=3 a3=0 items=0 ppid=1 pid=5165 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:11.733000 audit[5165]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc0a2219c0 a2=3 a3=0 items=0 ppid=1 pid=5165 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:12.065807 kernel: audit: type=1327 audit(1707813551.733:1688): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:11.733000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:39:12.096287 kernel: audit: type=1105 audit(1707813551.738:1689): pid=5165 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.738000 audit[5165]: USER_START pid=5165 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:12.190730 kernel: audit: type=1103 audit(1707813551.739:1690): pid=5167 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.739000 audit[5167]: CRED_ACQ pid=5167 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:12.279889 kernel: audit: type=1106 audit(1707813551.815:1691): pid=5165 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.815000 audit[5165]: USER_END pid=5165 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:39:12.375370 kernel: audit: type=1104 audit(1707813551.815:1692): pid=5165 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.815000 audit[5165]: CRED_DISP pid=5165 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:11.816000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-145.40.67.89:22-139.178.68.195:44444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:13.264898 kubelet[2614]: E0213 08:39:13.264850 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:13.279350 kubelet[2614]: E0213 08:39:13.279318 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:13.279350 kubelet[2614]: W0213 08:39:13.279327 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:13.279350 kubelet[2614]: E0213 08:39:13.279338 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:13.279542 kubelet[2614]: E0213 08:39:13.279507 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:13.279542 kubelet[2614]: W0213 08:39:13.279515 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:13.279542 kubelet[2614]: E0213 08:39:13.279524 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:13.279723 kubelet[2614]: E0213 08:39:13.279691 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:13.279723 kubelet[2614]: W0213 08:39:13.279698 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:13.279723 kubelet[2614]: E0213 08:39:13.279709 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:39:13.279850 kubelet[2614]: E0213 08:39:13.279844 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:39:13.279850 kubelet[2614]: W0213 08:39:13.279849 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:39:13.279902 kubelet[2614]: E0213 08:39:13.279857 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:39:13.757815 systemd[1]: Started sshd@69-145.40.67.89:22-161.35.108.241:38984.service. Feb 13 08:39:13.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.67.89:22-161.35.108.241:38984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:14.198216 sshd[5194]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:39:14.197000 audit[5194]: ANOM_LOGIN_FAILURES pid=5194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:14.197000 audit[5194]: USER_AUTH pid=5194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:39:14.198465 sshd[5194]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:39:15.264238 kubelet[2614]: E0213 08:39:15.264221 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:15.344013 systemd[1]: Started sshd@70-145.40.67.89:22-43.153.15.221:51364.service. Feb 13 08:39:15.343000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.67.89:22-43.153.15.221:51364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:15.436684 kubelet[2614]: E0213 08:39:15.436638 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:15.457793 sshd[5197]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:39:15.456000 audit[5197]: USER_AUTH pid=5197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:39:15.733224 sshd[5194]: Failed password for root from 161.35.108.241 port 38984 ssh2 Feb 13 08:39:16.824559 systemd[1]: Started sshd@71-145.40.67.89:22-139.178.68.195:33770.service. 
Feb 13 08:39:16.823000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.67.89:22-139.178.68.195:33770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:16.851658 kernel: kauditd_printk_skb: 6 callbacks suppressed Feb 13 08:39:16.851711 kernel: audit: type=1130 audit(1707813556.823:1699): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.67.89:22-139.178.68.195:33770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:16.969356 sshd[5200]: Accepted publickey for core from 139.178.68.195 port 33770 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:16.968000 audit[5200]: USER_ACCT pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:16.970609 sshd[5200]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:16.972912 systemd-logind[1463]: New session 55 of user core. Feb 13 08:39:16.973439 systemd[1]: Started session-55.scope. Feb 13 08:39:17.054621 sshd[5200]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:17.056036 systemd[1]: sshd@71-145.40.67.89:22-139.178.68.195:33770.service: Deactivated successfully. Feb 13 08:39:17.056532 systemd[1]: session-55.scope: Deactivated successfully. Feb 13 08:39:17.056867 systemd-logind[1463]: Session 55 logged out. Waiting for processes to exit. Feb 13 08:39:17.057375 systemd-logind[1463]: Removed session 55. 
Feb 13 08:39:16.969000 audit[5200]: CRED_ACQ pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.119478 sshd[5194]: Received disconnect from 161.35.108.241 port 38984:11: Bye Bye [preauth] Feb 13 08:39:17.119478 sshd[5194]: Disconnected from authenticating user root 161.35.108.241 port 38984 [preauth] Feb 13 08:39:17.119900 systemd[1]: sshd@69-145.40.67.89:22-161.35.108.241:38984.service: Deactivated successfully. Feb 13 08:39:17.153750 kernel: audit: type=1101 audit(1707813556.968:1700): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.153784 kernel: audit: type=1103 audit(1707813556.969:1701): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.153799 kernel: audit: type=1006 audit(1707813556.969:1702): pid=5200 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=55 res=1 Feb 13 08:39:17.212376 kernel: audit: type=1300 audit(1707813556.969:1702): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc17dd5860 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:16.969000 audit[5200]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc17dd5860 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:17.264539 kubelet[2614]: E0213 08:39:17.264500 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:17.304367 kernel: audit: type=1327 audit(1707813556.969:1702): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:16.969000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:17.334889 kernel: audit: type=1105 audit(1707813556.974:1703): pid=5200 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:16.974000 audit[5200]: USER_START pid=5200 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.429378 kernel: audit: type=1103 audit(1707813556.975:1704): pid=5202 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:16.975000 audit[5202]: CRED_ACQ pid=5202 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 
13 08:39:17.464025 sshd[5197]: Failed password for root from 43.153.15.221 port 51364 ssh2 Feb 13 08:39:17.518586 kernel: audit: type=1106 audit(1707813557.054:1705): pid=5200 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.054000 audit[5200]: USER_END pid=5200 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.614224 kernel: audit: type=1104 audit(1707813557.054:1706): pid=5200 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.054000 audit[5200]: CRED_DISP pid=5200 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:17.055000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-145.40.67.89:22-139.178.68.195:33770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:17.119000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-145.40.67.89:22-161.35.108.241:38984 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:18.321969 sshd[5197]: Received disconnect from 43.153.15.221 port 51364:11: Bye Bye [preauth] Feb 13 08:39:18.321969 sshd[5197]: Disconnected from authenticating user root 43.153.15.221 port 51364 [preauth] Feb 13 08:39:18.324355 systemd[1]: sshd@70-145.40.67.89:22-43.153.15.221:51364.service: Deactivated successfully. Feb 13 08:39:18.323000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-145.40.67.89:22-43.153.15.221:51364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:19.264937 kubelet[2614]: E0213 08:39:19.264886 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:20.438429 kubelet[2614]: E0213 08:39:20.438369 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:21.264217 kubelet[2614]: E0213 08:39:21.264173 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:22.064208 systemd[1]: Started sshd@72-145.40.67.89:22-139.178.68.195:33786.service. Feb 13 08:39:22.063000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.67.89:22-139.178.68.195:33786 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:22.091059 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:39:22.091128 kernel: audit: type=1130 audit(1707813562.063:1710): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.67.89:22-139.178.68.195:33786 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:22.207000 audit[5227]: USER_ACCT pid=5227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.208400 sshd[5227]: Accepted publickey for core from 139.178.68.195 port 33786 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:22.210526 sshd[5227]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:22.212612 systemd-logind[1463]: New session 56 of user core. Feb 13 08:39:22.213345 systemd[1]: Started session-56.scope. Feb 13 08:39:22.209000 audit[5227]: CRED_ACQ pid=5227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.302434 sshd[5227]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:22.303671 systemd[1]: sshd@72-145.40.67.89:22-139.178.68.195:33786.service: Deactivated successfully. Feb 13 08:39:22.304091 systemd[1]: session-56.scope: Deactivated successfully. Feb 13 08:39:22.304514 systemd-logind[1463]: Session 56 logged out. Waiting for processes to exit. Feb 13 08:39:22.304917 systemd-logind[1463]: Removed session 56. 
Feb 13 08:39:22.392797 kernel: audit: type=1101 audit(1707813562.207:1711): pid=5227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.392839 kernel: audit: type=1103 audit(1707813562.209:1712): pid=5227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.392853 kernel: audit: type=1006 audit(1707813562.209:1713): pid=5227 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=56 res=1 Feb 13 08:39:22.451401 kernel: audit: type=1300 audit(1707813562.209:1713): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc70d66050 a2=3 a3=0 items=0 ppid=1 pid=5227 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:22.209000 audit[5227]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc70d66050 a2=3 a3=0 items=0 ppid=1 pid=5227 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:22.543419 kernel: audit: type=1327 audit(1707813562.209:1713): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:22.209000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:22.573937 kernel: audit: type=1105 audit(1707813562.214:1714): pid=5227 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.214000 audit[5227]: USER_START pid=5227 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.668450 kernel: audit: type=1103 audit(1707813562.215:1715): pid=5229 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.215000 audit[5229]: CRED_ACQ pid=5229 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.757644 kernel: audit: type=1106 audit(1707813562.301:1716): pid=5227 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.301000 audit[5227]: USER_END pid=5227 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.853137 kernel: audit: type=1104 audit(1707813562.302:1717): pid=5227 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.302000 audit[5227]: CRED_DISP pid=5227 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:22.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-145.40.67.89:22-139.178.68.195:33786 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:23.265114 kubelet[2614]: E0213 08:39:23.265068 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:25.265240 kubelet[2614]: E0213 08:39:25.265194 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:25.439986 kubelet[2614]: E0213 08:39:25.439878 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:27.264811 kubelet[2614]: E0213 08:39:27.264766 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:27.313678 
systemd[1]: Started sshd@73-145.40.67.89:22-139.178.68.195:46942.service. Feb 13 08:39:27.313000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.67.89:22-139.178.68.195:46942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:27.341011 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:39:27.341136 kernel: audit: type=1130 audit(1707813567.313:1719): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.67.89:22-139.178.68.195:46942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:27.459013 sshd[5254]: Accepted publickey for core from 139.178.68.195 port 46942 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:27.458000 audit[5254]: USER_ACCT pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.461796 sshd[5254]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:27.464262 systemd-logind[1463]: New session 57 of user core. Feb 13 08:39:27.464831 systemd[1]: Started session-57.scope. Feb 13 08:39:27.542292 sshd[5254]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:27.543599 systemd[1]: sshd@73-145.40.67.89:22-139.178.68.195:46942.service: Deactivated successfully. Feb 13 08:39:27.544031 systemd[1]: session-57.scope: Deactivated successfully. Feb 13 08:39:27.544441 systemd-logind[1463]: Session 57 logged out. Waiting for processes to exit. Feb 13 08:39:27.544908 systemd-logind[1463]: Removed session 57. 
Feb 13 08:39:27.460000 audit[5254]: CRED_ACQ pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.640858 kernel: audit: type=1101 audit(1707813567.458:1720): pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.640911 kernel: audit: type=1103 audit(1707813567.460:1721): pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.640960 kernel: audit: type=1006 audit(1707813567.460:1722): pid=5254 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=57 res=1 Feb 13 08:39:27.460000 audit[5254]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff43770080 a2=3 a3=0 items=0 ppid=1 pid=5254 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:27.700004 kernel: audit: type=1300 audit(1707813567.460:1722): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff43770080 a2=3 a3=0 items=0 ppid=1 pid=5254 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:27.460000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:27.821995 kernel: audit: type=1327 audit(1707813567.460:1722): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:39:27.822030 kernel: audit: type=1105 audit(1707813567.466:1723): pid=5254 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.466000 audit[5254]: USER_START pid=5254 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.916550 kernel: audit: type=1103 audit(1707813567.466:1724): pid=5256 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.466000 audit[5256]: CRED_ACQ pid=5256 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:28.005751 kernel: audit: type=1106 audit(1707813567.541:1725): pid=5254 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.541000 audit[5254]: USER_END pid=5254 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:39:28.101290 kernel: audit: type=1104 audit(1707813567.542:1726): pid=5254 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.542000 audit[5254]: CRED_DISP pid=5254 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:27.542000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-145.40.67.89:22-139.178.68.195:46942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:29.265134 kubelet[2614]: E0213 08:39:29.265089 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:30.442137 kubelet[2614]: E0213 08:39:30.441980 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:31.264621 kubelet[2614]: E0213 08:39:31.264572 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:32.551829 systemd[1]: Started sshd@74-145.40.67.89:22-139.178.68.195:46958.service. 
Feb 13 08:39:32.550000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.67.89:22-139.178.68.195:46958 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:32.578900 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:39:32.578976 kernel: audit: type=1130 audit(1707813572.550:1728): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.67.89:22-139.178.68.195:46958 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:32.696000 audit[5279]: USER_ACCT pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.697874 sshd[5279]: Accepted publickey for core from 139.178.68.195 port 46958 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:32.699280 sshd[5279]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:32.701697 systemd-logind[1463]: New session 58 of user core. Feb 13 08:39:32.702209 systemd[1]: Started session-58.scope. Feb 13 08:39:32.781637 sshd[5279]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:32.782999 systemd[1]: sshd@74-145.40.67.89:22-139.178.68.195:46958.service: Deactivated successfully. Feb 13 08:39:32.783399 systemd[1]: session-58.scope: Deactivated successfully. Feb 13 08:39:32.783760 systemd-logind[1463]: Session 58 logged out. Waiting for processes to exit. Feb 13 08:39:32.784348 systemd-logind[1463]: Removed session 58. 
Feb 13 08:39:32.698000 audit[5279]: CRED_ACQ pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.879738 kernel: audit: type=1101 audit(1707813572.696:1729): pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.879771 kernel: audit: type=1103 audit(1707813572.698:1730): pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.879790 kernel: audit: type=1006 audit(1707813572.698:1731): pid=5279 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=58 res=1 Feb 13 08:39:32.938273 kernel: audit: type=1300 audit(1707813572.698:1731): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6d808960 a2=3 a3=0 items=0 ppid=1 pid=5279 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:32.698000 audit[5279]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6d808960 a2=3 a3=0 items=0 ppid=1 pid=5279 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:33.030221 kernel: audit: type=1327 audit(1707813572.698:1731): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:32.698000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:39:33.060694 kernel: audit: type=1105 audit(1707813572.703:1732): pid=5279 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.703000 audit[5279]: USER_START pid=5279 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.703000 audit[5281]: CRED_ACQ pid=5281 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:33.244351 kernel: audit: type=1103 audit(1707813572.703:1733): pid=5281 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:33.244414 kernel: audit: type=1106 audit(1707813572.781:1734): pid=5279 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.781000 audit[5279]: USER_END pid=5279 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:39:33.264289 kubelet[2614]: E0213 08:39:33.264250 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:33.339918 kernel: audit: type=1104 audit(1707813572.781:1735): pid=5279 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.781000 audit[5279]: CRED_DISP pid=5279 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:32.782000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-145.40.67.89:22-139.178.68.195:46958 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:35.264447 kubelet[2614]: E0213 08:39:35.264399 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:35.443553 kubelet[2614]: E0213 08:39:35.443484 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:35.610000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.610000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0009fcda0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:39:35.610000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:39:35.610000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.610000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c00204be30 a2=fc6 a3=0 
items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:39:35.610000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:39:35.893000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.893000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c002e6f140 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:39:35.893000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:39:35.893000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.893000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.893000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00bc7da40 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:39:35.893000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c01a381ef0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:39:35.893000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:39:35.893000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:39:35.893000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.893000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c003993620 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:39:35.893000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:39:35.893000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.893000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c0141b4ed0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:39:35.893000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:39:35.893000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:35.893000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0078c0900 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:39:35.893000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:39:37.265082 kubelet[2614]: E0213 08:39:37.265036 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:37.790576 systemd[1]: Started sshd@75-145.40.67.89:22-139.178.68.195:49942.service. Feb 13 08:39:37.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.67.89:22-139.178.68.195:49942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:37.817623 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:39:37.817704 kernel: audit: type=1130 audit(1707813577.789:1745): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.67.89:22-139.178.68.195:49942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:37.935000 audit[5304]: USER_ACCT pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:37.936401 sshd[5304]: Accepted publickey for core from 139.178.68.195 port 49942 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:37.937210 sshd[5304]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:37.939513 systemd-logind[1463]: New session 59 of user core. Feb 13 08:39:37.939954 systemd[1]: Started session-59.scope. Feb 13 08:39:38.019204 sshd[5304]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:38.020476 systemd[1]: sshd@75-145.40.67.89:22-139.178.68.195:49942.service: Deactivated successfully. Feb 13 08:39:38.020891 systemd[1]: session-59.scope: Deactivated successfully. Feb 13 08:39:38.021190 systemd-logind[1463]: Session 59 logged out. Waiting for processes to exit. Feb 13 08:39:38.021725 systemd-logind[1463]: Removed session 59. 
Feb 13 08:39:37.936000 audit[5304]: CRED_ACQ pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.118233 kernel: audit: type=1101 audit(1707813577.935:1746): pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.118323 kernel: audit: type=1103 audit(1707813577.936:1747): pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.118345 kernel: audit: type=1006 audit(1707813577.936:1748): pid=5304 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=59 res=1 Feb 13 08:39:38.176911 kernel: audit: type=1300 audit(1707813577.936:1748): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6b1cced0 a2=3 a3=0 items=0 ppid=1 pid=5304 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:37.936000 audit[5304]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff6b1cced0 a2=3 a3=0 items=0 ppid=1 pid=5304 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:38.269029 kernel: audit: type=1327 audit(1707813577.936:1748): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:37.936000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:39:38.299513 kernel: audit: type=1105 audit(1707813577.941:1749): pid=5304 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:37.941000 audit[5304]: USER_START pid=5304 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.394038 kernel: audit: type=1103 audit(1707813577.941:1750): pid=5306 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:37.941000 audit[5306]: CRED_ACQ pid=5306 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.483346 kernel: audit: type=1106 audit(1707813578.018:1751): pid=5304 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.018000 audit[5304]: USER_END pid=5304 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:39:38.578889 kernel: audit: type=1104 audit(1707813578.018:1752): pid=5304 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.018000 audit[5304]: CRED_DISP pid=5304 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:38.019000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-145.40.67.89:22-139.178.68.195:49942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:39.265023 kubelet[2614]: E0213 08:39:39.265004 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:40.445272 kubelet[2614]: E0213 08:39:40.445164 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:40.950000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:40.950000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c003021680 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:39:40.950000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:39:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000557ca0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:39:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:39:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c0009fcee0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:39:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:39:40.952000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:39:40.952000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002ebb560 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:39:40.952000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:39:41.264852 kubelet[2614]: E0213 08:39:41.264804 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:43.027735 systemd[1]: Started sshd@76-145.40.67.89:22-139.178.68.195:49956.service. 
Feb 13 08:39:43.026000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.67.89:22-139.178.68.195:49956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:43.055013 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:39:43.055081 kernel: audit: type=1130 audit(1707813583.026:1758): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.67.89:22-139.178.68.195:49956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:43.172000 audit[5331]: USER_ACCT pid=5331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.173489 sshd[5331]: Accepted publickey for core from 139.178.68.195 port 49956 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:43.174971 sshd[5331]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:43.177320 systemd-logind[1463]: New session 60 of user core. Feb 13 08:39:43.177802 systemd[1]: Started session-60.scope. Feb 13 08:39:43.254788 sshd[5331]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:43.256173 systemd[1]: sshd@76-145.40.67.89:22-139.178.68.195:49956.service: Deactivated successfully. Feb 13 08:39:43.256596 systemd[1]: session-60.scope: Deactivated successfully. Feb 13 08:39:43.256889 systemd-logind[1463]: Session 60 logged out. Waiting for processes to exit. Feb 13 08:39:43.257369 systemd-logind[1463]: Removed session 60. 
Feb 13 08:39:43.173000 audit[5331]: CRED_ACQ pid=5331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.264963 kernel: audit: type=1101 audit(1707813583.172:1759): pid=5331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.264984 kernel: audit: type=1103 audit(1707813583.173:1760): pid=5331 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.264998 kubelet[2614]: E0213 08:39:43.264893 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:43.414003 kernel: audit: type=1006 audit(1707813583.173:1761): pid=5331 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=60 res=1 Feb 13 08:39:43.414039 kernel: audit: type=1300 audit(1707813583.173:1761): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc13243a80 a2=3 a3=0 items=0 ppid=1 pid=5331 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:43.173000 audit[5331]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc13243a80 a2=3 a3=0 items=0 ppid=1 pid=5331 
auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:43.173000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:43.536656 kernel: audit: type=1327 audit(1707813583.173:1761): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:43.536685 kernel: audit: type=1105 audit(1707813583.179:1762): pid=5331 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.179000 audit[5331]: USER_START pid=5331 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.631169 kernel: audit: type=1103 audit(1707813583.179:1763): pid=5333 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.179000 audit[5333]: CRED_ACQ pid=5333 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.254000 audit[5331]: USER_END pid=5331 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:39:43.816017 kernel: audit: type=1106 audit(1707813583.254:1764): pid=5331 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.816047 kernel: audit: type=1104 audit(1707813583.254:1765): pid=5331 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.254000 audit[5331]: CRED_DISP pid=5331 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:43.255000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-145.40.67.89:22-139.178.68.195:49956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:44.465548 systemd[1]: Started sshd@77-145.40.67.89:22-61.83.148.111:36486.service. Feb 13 08:39:44.465000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.67.89:22-61.83.148.111:36486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:45.264461 kubelet[2614]: E0213 08:39:45.264415 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:45.298746 sshd[5355]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=61.83.148.111 user=root Feb 13 08:39:45.297000 audit[5355]: ANOM_LOGIN_FAILURES pid=5355 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:45.297000 audit[5355]: USER_AUTH pid=5355 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=61.83.148.111 addr=61.83.148.111 terminal=ssh res=failed' Feb 13 08:39:45.298840 sshd[5355]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:39:45.447023 kubelet[2614]: E0213 08:39:45.446949 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:47.264262 kubelet[2614]: E0213 08:39:47.264217 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:47.425199 sshd[5355]: Failed password for root from 61.83.148.111 port 36486 ssh2 Feb 13 08:39:48.259232 systemd[1]: Started sshd@78-145.40.67.89:22-139.178.68.195:33960.service. 
Feb 13 08:39:48.258000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.67.89:22-139.178.68.195:33960 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:48.286429 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:39:48.286514 kernel: audit: type=1130 audit(1707813588.258:1770): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.67.89:22-139.178.68.195:33960 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:48.292615 sshd[5355]: Received disconnect from 61.83.148.111 port 36486:11: Bye Bye [preauth] Feb 13 08:39:48.292615 sshd[5355]: Disconnected from authenticating user root 61.83.148.111 port 36486 [preauth] Feb 13 08:39:48.293319 systemd[1]: sshd@77-145.40.67.89:22-61.83.148.111:36486.service: Deactivated successfully. Feb 13 08:39:48.292000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.67.89:22-61.83.148.111:36486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:48.376993 kernel: audit: type=1131 audit(1707813588.292:1771): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-145.40.67.89:22-61.83.148.111:36486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:48.404915 sshd[5358]: Accepted publickey for core from 139.178.68.195 port 33960 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:48.406249 sshd[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:48.408937 systemd-logind[1463]: New session 61 of user core. Feb 13 08:39:48.409395 systemd[1]: Started session-61.scope. 
Feb 13 08:39:48.403000 audit[5358]: USER_ACCT pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.487259 sshd[5358]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:48.488687 systemd[1]: sshd@78-145.40.67.89:22-139.178.68.195:33960.service: Deactivated successfully. Feb 13 08:39:48.489092 systemd[1]: session-61.scope: Deactivated successfully. Feb 13 08:39:48.489455 systemd-logind[1463]: Session 61 logged out. Waiting for processes to exit. Feb 13 08:39:48.489869 systemd-logind[1463]: Removed session 61. Feb 13 08:39:48.405000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.646464 kernel: audit: type=1101 audit(1707813588.403:1772): pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.646495 kernel: audit: type=1103 audit(1707813588.405:1773): pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.646509 kernel: audit: type=1006 audit(1707813588.405:1774): pid=5358 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=61 res=1 Feb 13 08:39:48.705014 kernel: audit: type=1300 audit(1707813588.405:1774): arch=c000003e syscall=1 
success=yes exit=3 a0=5 a1=7ffd6cb15140 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:48.405000 audit[5358]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6cb15140 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:48.797011 kernel: audit: type=1327 audit(1707813588.405:1774): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:48.405000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:48.827537 kernel: audit: type=1105 audit(1707813588.410:1775): pid=5358 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.410000 audit[5358]: USER_START pid=5358 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.922870 kernel: audit: type=1103 audit(1707813588.411:1776): pid=5361 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.411000 audit[5361]: CRED_ACQ pid=5361 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:49.012119 kernel: audit: type=1106 audit(1707813588.486:1777): pid=5358 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.486000 audit[5358]: USER_END pid=5358 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.487000 audit[5358]: CRED_DISP pid=5358 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:48.487000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-145.40.67.89:22-139.178.68.195:33960 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:49.264822 kubelet[2614]: E0213 08:39:49.264630 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:50.449010 kubelet[2614]: E0213 08:39:50.448914 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:51.264879 kubelet[2614]: E0213 08:39:51.264834 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:53.264962 kubelet[2614]: E0213 08:39:53.264916 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:53.496730 systemd[1]: Started sshd@79-145.40.67.89:22-139.178.68.195:33962.service. Feb 13 08:39:53.495000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.67.89:22-139.178.68.195:33962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:39:53.523746 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:39:53.523819 kernel: audit: type=1130 audit(1707813593.495:1780): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.67.89:22-139.178.68.195:33962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:53.642000 audit[5386]: USER_ACCT pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.642415 sshd[5386]: Accepted publickey for core from 139.178.68.195 port 33962 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:53.645246 sshd[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:53.647744 systemd-logind[1463]: New session 62 of user core. Feb 13 08:39:53.648220 systemd[1]: Started session-62.scope. Feb 13 08:39:53.728549 sshd[5386]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:53.729867 systemd[1]: sshd@79-145.40.67.89:22-139.178.68.195:33962.service: Deactivated successfully. Feb 13 08:39:53.730294 systemd[1]: session-62.scope: Deactivated successfully. Feb 13 08:39:53.730630 systemd-logind[1463]: Session 62 logged out. Waiting for processes to exit. Feb 13 08:39:53.731075 systemd-logind[1463]: Removed session 62. 
Feb 13 08:39:53.644000 audit[5386]: CRED_ACQ pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.825167 kernel: audit: type=1101 audit(1707813593.642:1781): pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.825202 kernel: audit: type=1103 audit(1707813593.644:1782): pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.825218 kernel: audit: type=1006 audit(1707813593.644:1783): pid=5386 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Feb 13 08:39:53.884128 kernel: audit: type=1300 audit(1707813593.644:1783): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc274b4720 a2=3 a3=0 items=0 ppid=1 pid=5386 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:53.644000 audit[5386]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc274b4720 a2=3 a3=0 items=0 ppid=1 pid=5386 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:53.976639 kernel: audit: type=1327 audit(1707813593.644:1783): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:53.644000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:39:54.007315 kernel: audit: type=1105 audit(1707813593.649:1784): pid=5386 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.649000 audit[5386]: USER_START pid=5386 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:54.102302 kernel: audit: type=1103 audit(1707813593.650:1785): pid=5388 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.650000 audit[5388]: CRED_ACQ pid=5388 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:54.192008 kernel: audit: type=1106 audit(1707813593.728:1786): pid=5386 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.728000 audit[5386]: USER_END pid=5386 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:39:54.288012 kernel: audit: type=1104 audit(1707813593.728:1787): pid=5386 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.728000 audit[5386]: CRED_DISP pid=5386 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:53.729000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-145.40.67.89:22-139.178.68.195:33962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:55.264290 kubelet[2614]: E0213 08:39:55.264245 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:55.451140 kubelet[2614]: E0213 08:39:55.451062 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:39:57.264702 kubelet[2614]: E0213 08:39:57.264655 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:58.739844 systemd[1]: Started sshd@80-145.40.67.89:22-139.178.68.195:47854.service. 
Feb 13 08:39:58.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.67.89:22-139.178.68.195:47854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:58.766766 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:39:58.766803 kernel: audit: type=1130 audit(1707813598.739:1789): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.67.89:22-139.178.68.195:47854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:39:58.883000 audit[5411]: USER_ACCT pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.884801 sshd[5411]: Accepted publickey for core from 139.178.68.195 port 47854 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:39:58.887887 sshd[5411]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:39:58.897275 systemd-logind[1463]: New session 63 of user core. Feb 13 08:39:58.900107 systemd[1]: Started session-63.scope. Feb 13 08:39:58.885000 audit[5411]: CRED_ACQ pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.994917 sshd[5411]: pam_unix(sshd:session): session closed for user core Feb 13 08:39:58.996441 systemd[1]: sshd@80-145.40.67.89:22-139.178.68.195:47854.service: Deactivated successfully. Feb 13 08:39:58.996851 systemd[1]: session-63.scope: Deactivated successfully. Feb 13 08:39:58.997278 systemd-logind[1463]: Session 63 logged out. 
Waiting for processes to exit. Feb 13 08:39:58.997735 systemd-logind[1463]: Removed session 63. Feb 13 08:39:59.071628 kernel: audit: type=1101 audit(1707813598.883:1790): pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:59.071660 kernel: audit: type=1103 audit(1707813598.885:1791): pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:59.071674 kernel: audit: type=1006 audit(1707813598.886:1792): pid=5411 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1 Feb 13 08:39:59.130556 kernel: audit: type=1300 audit(1707813598.886:1792): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff02450240 a2=3 a3=0 items=0 ppid=1 pid=5411 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:58.886000 audit[5411]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff02450240 a2=3 a3=0 items=0 ppid=1 pid=5411 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:39:59.223021 kernel: audit: type=1327 audit(1707813598.886:1792): proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:58.886000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:39:59.253780 kernel: audit: type=1105 audit(1707813598.913:1793): pid=5411 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.913000 audit[5411]: USER_START pid=5411 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:59.264841 kubelet[2614]: E0213 08:39:59.264803 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:39:59.348770 kernel: audit: type=1103 audit(1707813598.916:1794): pid=5413 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.916000 audit[5413]: CRED_ACQ pid=5413 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:59.438013 kernel: audit: type=1106 audit(1707813598.994:1795): pid=5411 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.994000 audit[5411]: USER_END pid=5411 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.994000 audit[5411]: CRED_DISP pid=5411 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:59.622846 kernel: audit: type=1104 audit(1707813598.994:1796): pid=5411 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:39:58.995000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-145.40.67.89:22-139.178.68.195:47854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:00.341161 kubelet[2614]: E0213 08:40:00.341061 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.341161 kubelet[2614]: W0213 08:40:00.341105 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.341161 kubelet[2614]: E0213 08:40:00.341156 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.342175 kubelet[2614]: E0213 08:40:00.341654 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.342175 kubelet[2614]: W0213 08:40:00.341685 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.342175 kubelet[2614]: E0213 08:40:00.341721 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.342469 kubelet[2614]: E0213 08:40:00.342198 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.342469 kubelet[2614]: W0213 08:40:00.342230 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.342469 kubelet[2614]: E0213 08:40:00.342265 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.342854 kubelet[2614]: E0213 08:40:00.342816 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.342854 kubelet[2614]: W0213 08:40:00.342850 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.343086 kubelet[2614]: E0213 08:40:00.342892 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.343473 kubelet[2614]: E0213 08:40:00.343397 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.343473 kubelet[2614]: W0213 08:40:00.343429 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.343473 kubelet[2614]: E0213 08:40:00.343468 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.344019 kubelet[2614]: E0213 08:40:00.343914 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.344019 kubelet[2614]: W0213 08:40:00.343964 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.344019 kubelet[2614]: E0213 08:40:00.343999 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.344622 kubelet[2614]: E0213 08:40:00.344545 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.344622 kubelet[2614]: W0213 08:40:00.344577 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.344622 kubelet[2614]: E0213 08:40:00.344613 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.345059 kubelet[2614]: E0213 08:40:00.345027 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.345059 kubelet[2614]: W0213 08:40:00.345049 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.345260 kubelet[2614]: E0213 08:40:00.345080 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.345607 kubelet[2614]: E0213 08:40:00.345532 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.345607 kubelet[2614]: W0213 08:40:00.345563 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.345607 kubelet[2614]: E0213 08:40:00.345600 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.346163 kubelet[2614]: E0213 08:40:00.346089 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.346163 kubelet[2614]: W0213 08:40:00.346115 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.346163 kubelet[2614]: E0213 08:40:00.346149 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.346666 kubelet[2614]: E0213 08:40:00.346609 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.346666 kubelet[2614]: W0213 08:40:00.346641 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.346894 kubelet[2614]: E0213 08:40:00.346677 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.347212 kubelet[2614]: E0213 08:40:00.347137 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.347212 kubelet[2614]: W0213 08:40:00.347169 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.347212 kubelet[2614]: E0213 08:40:00.347205 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.347732 kubelet[2614]: E0213 08:40:00.347679 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.347732 kubelet[2614]: W0213 08:40:00.347705 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.348000 kubelet[2614]: E0213 08:40:00.347742 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.348317 kubelet[2614]: E0213 08:40:00.348247 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.348317 kubelet[2614]: W0213 08:40:00.348279 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.348317 kubelet[2614]: E0213 08:40:00.348316 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.348789 kubelet[2614]: E0213 08:40:00.348762 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.348789 kubelet[2614]: W0213 08:40:00.348788 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.349031 kubelet[2614]: E0213 08:40:00.348825 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.349356 kubelet[2614]: E0213 08:40:00.349324 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.349476 kubelet[2614]: W0213 08:40:00.349357 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.349476 kubelet[2614]: E0213 08:40:00.349393 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.349796 kubelet[2614]: E0213 08:40:00.349773 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.349910 kubelet[2614]: W0213 08:40:00.349796 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.349910 kubelet[2614]: E0213 08:40:00.349826 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.350303 kubelet[2614]: E0213 08:40:00.350273 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.350407 kubelet[2614]: W0213 08:40:00.350305 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.350407 kubelet[2614]: E0213 08:40:00.350341 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.350789 kubelet[2614]: E0213 08:40:00.350765 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.350890 kubelet[2614]: W0213 08:40:00.350789 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.350890 kubelet[2614]: E0213 08:40:00.350819 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:00.351360 kubelet[2614]: E0213 08:40:00.351283 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:00.351360 kubelet[2614]: W0213 08:40:00.351315 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:00.351360 kubelet[2614]: E0213 08:40:00.351351 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:00.452587 kubelet[2614]: E0213 08:40:00.452523 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:01.265144 kubelet[2614]: E0213 08:40:01.265099 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:03.264888 kubelet[2614]: E0213 08:40:03.264807 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:04.004175 systemd[1]: Started sshd@81-145.40.67.89:22-139.178.68.195:47856.service. 
Feb 13 08:40:04.003000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.67.89:22-139.178.68.195:47856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:04.031422 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:04.031478 kernel: audit: type=1130 audit(1707813604.003:1798): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.67.89:22-139.178.68.195:47856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:04.150000 audit[5456]: USER_ACCT pid=5456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.152125 sshd[5456]: Accepted publickey for core from 139.178.68.195 port 47856 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:04.155965 sshd[5456]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:04.159277 systemd-logind[1463]: New session 64 of user core. Feb 13 08:40:04.159691 systemd[1]: Started session-64.scope. Feb 13 08:40:04.240171 sshd[5456]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:04.241613 systemd[1]: sshd@81-145.40.67.89:22-139.178.68.195:47856.service: Deactivated successfully. Feb 13 08:40:04.242043 systemd[1]: session-64.scope: Deactivated successfully. Feb 13 08:40:04.242459 systemd-logind[1463]: Session 64 logged out. Waiting for processes to exit. Feb 13 08:40:04.242848 systemd-logind[1463]: Removed session 64. 
Feb 13 08:40:04.154000 audit[5456]: CRED_ACQ pid=5456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.243997 kernel: audit: type=1101 audit(1707813604.150:1799): pid=5456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.244029 kernel: audit: type=1103 audit(1707813604.154:1800): pid=5456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.392358 kernel: audit: type=1006 audit(1707813604.154:1801): pid=5456 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1 Feb 13 08:40:04.392398 kernel: audit: type=1300 audit(1707813604.154:1801): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcfc845430 a2=3 a3=0 items=0 ppid=1 pid=5456 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:04.154000 audit[5456]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcfc845430 a2=3 a3=0 items=0 ppid=1 pid=5456 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:04.154000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:04.514840 kernel: audit: type=1327 audit(1707813604.154:1801): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:40:04.514930 kernel: audit: type=1105 audit(1707813604.160:1802): pid=5456 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.160000 audit[5456]: USER_START pid=5456 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.609387 kernel: audit: type=1103 audit(1707813604.161:1803): pid=5458 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.161000 audit[5458]: CRED_ACQ pid=5458 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.239000 audit[5456]: USER_END pid=5456 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.794185 kernel: audit: type=1106 audit(1707813604.239:1804): pid=5456 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:40:04.794264 kernel: audit: type=1104 audit(1707813604.240:1805): pid=5456 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.240000 audit[5456]: CRED_DISP pid=5456 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:04.240000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-145.40.67.89:22-139.178.68.195:47856 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:05.264438 kubelet[2614]: E0213 08:40:05.264339 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:05.454012 kubelet[2614]: E0213 08:40:05.453961 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:07.264482 kubelet[2614]: E0213 08:40:07.264439 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:08.311744 kubelet[2614]: E0213 08:40:08.311691 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", 
error: unexpected end of JSON input Feb 13 08:40:08.311744 kubelet[2614]: W0213 08:40:08.311734 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.312628 kubelet[2614]: E0213 08:40:08.311775 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:08.312628 kubelet[2614]: E0213 08:40:08.312171 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.312628 kubelet[2614]: W0213 08:40:08.312193 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.312628 kubelet[2614]: E0213 08:40:08.312220 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:08.312628 kubelet[2614]: E0213 08:40:08.312554 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.312628 kubelet[2614]: W0213 08:40:08.312576 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.312628 kubelet[2614]: E0213 08:40:08.312609 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:08.313247 kubelet[2614]: E0213 08:40:08.313067 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.313247 kubelet[2614]: W0213 08:40:08.313092 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.313247 kubelet[2614]: E0213 08:40:08.313124 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:08.313513 kubelet[2614]: E0213 08:40:08.313480 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.313513 kubelet[2614]: W0213 08:40:08.313501 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.313673 kubelet[2614]: E0213 08:40:08.313535 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:08.313898 kubelet[2614]: E0213 08:40:08.313873 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.313898 kubelet[2614]: W0213 08:40:08.313894 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.314155 kubelet[2614]: E0213 08:40:08.313918 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:08.314340 kubelet[2614]: E0213 08:40:08.314314 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.314340 kubelet[2614]: W0213 08:40:08.314335 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.314556 kubelet[2614]: E0213 08:40:08.314361 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:08.314741 kubelet[2614]: E0213 08:40:08.314716 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.314741 kubelet[2614]: W0213 08:40:08.314737 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.314975 kubelet[2614]: E0213 08:40:08.314764 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:08.315102 kubelet[2614]: E0213 08:40:08.315075 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.315102 kubelet[2614]: W0213 08:40:08.315098 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.315306 kubelet[2614]: E0213 08:40:08.315124 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:08.315506 kubelet[2614]: E0213 08:40:08.315475 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.315506 kubelet[2614]: W0213 08:40:08.315497 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.315771 kubelet[2614]: E0213 08:40:08.315530 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:08.315865 kubelet[2614]: E0213 08:40:08.315853 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.315989 kubelet[2614]: W0213 08:40:08.315869 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.315989 kubelet[2614]: E0213 08:40:08.315891 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:08.316305 kubelet[2614]: E0213 08:40:08.316281 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:08.316305 kubelet[2614]: W0213 08:40:08.316301 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:08.316509 kubelet[2614]: E0213 08:40:08.316328 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:09.249716 systemd[1]: Started sshd@82-145.40.67.89:22-139.178.68.195:48736.service. Feb 13 08:40:09.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.67.89:22-139.178.68.195:48736 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:09.264938 kubelet[2614]: E0213 08:40:09.264922 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:09.276498 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:09.276534 kernel: audit: type=1130 audit(1707813609.247:1807): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.67.89:22-139.178.68.195:48736 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:09.392000 audit[5492]: USER_ACCT pid=5492 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.393367 sshd[5492]: Accepted publickey for core from 139.178.68.195 port 48736 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:09.394558 sshd[5492]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:09.397001 systemd-logind[1463]: New session 65 of user core. Feb 13 08:40:09.397503 systemd[1]: Started session-65.scope. Feb 13 08:40:09.474951 sshd[5492]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:09.476416 systemd[1]: sshd@82-145.40.67.89:22-139.178.68.195:48736.service: Deactivated successfully. Feb 13 08:40:09.476838 systemd[1]: session-65.scope: Deactivated successfully. Feb 13 08:40:09.477203 systemd-logind[1463]: Session 65 logged out. Waiting for processes to exit. Feb 13 08:40:09.477684 systemd-logind[1463]: Removed session 65. 
Feb 13 08:40:09.392000 audit[5492]: CRED_ACQ pid=5492 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.485026 kernel: audit: type=1101 audit(1707813609.392:1808): pid=5492 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.485059 kernel: audit: type=1103 audit(1707813609.392:1809): pid=5492 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.634016 kernel: audit: type=1006 audit(1707813609.392:1810): pid=5492 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1 Feb 13 08:40:09.634053 kernel: audit: type=1300 audit(1707813609.392:1810): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc98ec040 a2=3 a3=0 items=0 ppid=1 pid=5492 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:09.392000 audit[5492]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc98ec040 a2=3 a3=0 items=0 ppid=1 pid=5492 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:09.392000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:09.756562 kernel: audit: type=1327 audit(1707813609.392:1810): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:40:09.756593 kernel: audit: type=1105 audit(1707813609.397:1811): pid=5492 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.397000 audit[5492]: USER_START pid=5492 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.851095 kernel: audit: type=1103 audit(1707813609.398:1812): pid=5494 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.398000 audit[5494]: CRED_ACQ pid=5494 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.473000 audit[5492]: USER_END pid=5492 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:10.035899 kernel: audit: type=1106 audit(1707813609.473:1813): pid=5492 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:40:10.035937 kernel: audit: type=1104 audit(1707813609.473:1814): pid=5492 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.473000 audit[5492]: CRED_DISP pid=5492 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:09.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-145.40.67.89:22-139.178.68.195:48736 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:10.454499 kubelet[2614]: E0213 08:40:10.454484 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:11.265493 kubelet[2614]: E0213 08:40:11.265396 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:11.651750 systemd[1]: Started sshd@83-145.40.67.89:22-43.153.15.221:41974.service. Feb 13 08:40:11.649000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.67.89:22-43.153.15.221:41974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:11.765133 sshd[5516]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:40:11.764000 audit[5516]: USER_AUTH pid=5516 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:40:13.264728 kubelet[2614]: E0213 08:40:13.264681 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:13.792179 sshd[5516]: Failed password for root from 43.153.15.221 port 41974 ssh2 Feb 13 08:40:13.971359 systemd[1]: Started sshd@84-145.40.67.89:22-161.35.108.241:47578.service. Feb 13 08:40:13.969000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.67.89:22-161.35.108.241:47578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:14.415751 sshd[5519]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:40:14.414000 audit[5519]: ANOM_LOGIN_FAILURES pid=5519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:14.416013 sshd[5519]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:40:14.443124 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:40:14.443219 kernel: audit: type=2100 audit(1707813614.414:1819): pid=5519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:14.478188 systemd[1]: Started sshd@85-145.40.67.89:22-139.178.68.195:48746.service. Feb 13 08:40:14.414000 audit[5519]: USER_AUTH pid=5519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:40:14.596556 kernel: audit: type=1100 audit(1707813614.414:1820): pid=5519 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:40:14.596588 kernel: audit: type=1130 audit(1707813614.477:1821): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.67.89:22-139.178.68.195:48746 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:14.477000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.67.89:22-139.178.68.195:48746 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:14.625223 sshd[5522]: Accepted publickey for core from 139.178.68.195 port 48746 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:14.627238 sshd[5522]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:14.629542 systemd-logind[1463]: New session 66 of user core. Feb 13 08:40:14.629992 systemd[1]: Started session-66.scope. Feb 13 08:40:14.624000 audit[5522]: USER_ACCT pid=5522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.685937 kernel: audit: type=1101 audit(1707813614.624:1822): pid=5522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.690774 sshd[5516]: Received disconnect from 43.153.15.221 port 41974:11: Bye Bye [preauth] Feb 13 08:40:14.690774 sshd[5516]: Disconnected from authenticating user root 43.153.15.221 port 41974 [preauth] Feb 13 08:40:14.691388 systemd[1]: sshd@83-145.40.67.89:22-43.153.15.221:41974.service: Deactivated successfully. Feb 13 08:40:14.708851 sshd[5522]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:14.710326 systemd[1]: sshd@85-145.40.67.89:22-139.178.68.195:48746.service: Deactivated successfully. Feb 13 08:40:14.710733 systemd[1]: session-66.scope: Deactivated successfully. Feb 13 08:40:14.711121 systemd-logind[1463]: Session 66 logged out. Waiting for processes to exit. Feb 13 08:40:14.711587 systemd-logind[1463]: Removed session 66. 
Feb 13 08:40:14.626000 audit[5522]: CRED_ACQ pid=5522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.867856 kernel: audit: type=1103 audit(1707813614.626:1823): pid=5522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.867890 kernel: audit: type=1006 audit(1707813614.626:1824): pid=5522 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1 Feb 13 08:40:14.926442 kernel: audit: type=1300 audit(1707813614.626:1824): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc20f4e7f0 a2=3 a3=0 items=0 ppid=1 pid=5522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:14.626000 audit[5522]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc20f4e7f0 a2=3 a3=0 items=0 ppid=1 pid=5522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:15.018740 kernel: audit: type=1327 audit(1707813614.626:1824): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:14.626000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:14.630000 audit[5522]: USER_START pid=5522 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 
13 08:40:15.145012 kernel: audit: type=1105 audit(1707813614.630:1825): pid=5522 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:15.145043 kernel: audit: type=1103 audit(1707813614.630:1826): pid=5524 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.630000 audit[5524]: CRED_ACQ pid=5524 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.690000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-145.40.67.89:22-43.153.15.221:41974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:14.707000 audit[5522]: USER_END pid=5522 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.707000 audit[5522]: CRED_DISP pid=5522 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:14.709000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-145.40.67.89:22-139.178.68.195:48746 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:15.264261 kubelet[2614]: E0213 08:40:15.264233 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:15.456314 kubelet[2614]: E0213 08:40:15.456250 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:16.522318 sshd[5519]: Failed password for root from 161.35.108.241 port 47578 ssh2 Feb 13 08:40:17.264140 kubelet[2614]: E0213 08:40:17.264120 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e 
Feb 13 08:40:17.277806 kubelet[2614]: E0213 08:40:17.277793 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.277806 kubelet[2614]: W0213 08:40:17.277804 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.277917 kubelet[2614]: E0213 08:40:17.277819 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.277986 kubelet[2614]: E0213 08:40:17.277976 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.277986 kubelet[2614]: W0213 08:40:17.277984 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278072 kubelet[2614]: E0213 08:40:17.277996 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.278136 kubelet[2614]: E0213 08:40:17.278126 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.278136 kubelet[2614]: W0213 08:40:17.278135 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278230 kubelet[2614]: E0213 08:40:17.278147 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.278348 kubelet[2614]: E0213 08:40:17.278336 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.278348 kubelet[2614]: W0213 08:40:17.278346 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278444 kubelet[2614]: E0213 08:40:17.278359 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.278487 kubelet[2614]: E0213 08:40:17.278478 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.278538 kubelet[2614]: W0213 08:40:17.278486 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278538 kubelet[2614]: E0213 08:40:17.278499 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.278649 kubelet[2614]: E0213 08:40:17.278640 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.278649 kubelet[2614]: W0213 08:40:17.278647 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278737 kubelet[2614]: E0213 08:40:17.278658 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.278807 kubelet[2614]: E0213 08:40:17.278799 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.278864 kubelet[2614]: W0213 08:40:17.278806 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278864 kubelet[2614]: E0213 08:40:17.278819 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.278962 kubelet[2614]: E0213 08:40:17.278934 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.278962 kubelet[2614]: W0213 08:40:17.278942 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.278962 kubelet[2614]: E0213 08:40:17.278954 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.279103 kubelet[2614]: E0213 08:40:17.279094 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.279103 kubelet[2614]: W0213 08:40:17.279101 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.279207 kubelet[2614]: E0213 08:40:17.279113 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.279299 kubelet[2614]: E0213 08:40:17.279291 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.279299 kubelet[2614]: W0213 08:40:17.279298 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.279387 kubelet[2614]: E0213 08:40:17.279312 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.279456 kubelet[2614]: E0213 08:40:17.279448 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.279456 kubelet[2614]: W0213 08:40:17.279455 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.279573 kubelet[2614]: E0213 08:40:17.279467 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.279626 kubelet[2614]: E0213 08:40:17.279617 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.279626 kubelet[2614]: W0213 08:40:17.279625 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.279712 kubelet[2614]: E0213 08:40:17.279637 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.279758 kubelet[2614]: E0213 08:40:17.279745 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.279758 kubelet[2614]: W0213 08:40:17.279753 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.279843 kubelet[2614]: E0213 08:40:17.279767 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.279890 kubelet[2614]: E0213 08:40:17.279867 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.279890 kubelet[2614]: W0213 08:40:17.279875 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.279890 kubelet[2614]: E0213 08:40:17.279887 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.280030 kubelet[2614]: E0213 08:40:17.279992 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.280030 kubelet[2614]: W0213 08:40:17.280000 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.280030 kubelet[2614]: E0213 08:40:17.280012 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.280157 kubelet[2614]: E0213 08:40:17.280111 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.280157 kubelet[2614]: W0213 08:40:17.280119 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.280157 kubelet[2614]: E0213 08:40:17.280130 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.337587 sshd[5519]: Received disconnect from 161.35.108.241 port 47578:11: Bye Bye [preauth] Feb 13 08:40:17.337587 sshd[5519]: Disconnected from authenticating user root 161.35.108.241 port 47578 [preauth] Feb 13 08:40:17.340144 systemd[1]: sshd@84-145.40.67.89:22-161.35.108.241:47578.service: Deactivated successfully. Feb 13 08:40:17.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-145.40.67.89:22-161.35.108.241:47578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:17.379287 kubelet[2614]: E0213 08:40:17.379190 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.379287 kubelet[2614]: W0213 08:40:17.379234 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.379287 kubelet[2614]: E0213 08:40:17.379281 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.379911 kubelet[2614]: E0213 08:40:17.379876 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.380086 kubelet[2614]: W0213 08:40:17.379912 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.380086 kubelet[2614]: E0213 08:40:17.379981 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.380630 kubelet[2614]: E0213 08:40:17.380553 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.380630 kubelet[2614]: W0213 08:40:17.380589 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.380630 kubelet[2614]: E0213 08:40:17.380637 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.381201 kubelet[2614]: E0213 08:40:17.381127 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.381201 kubelet[2614]: W0213 08:40:17.381160 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.381201 kubelet[2614]: E0213 08:40:17.381207 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.381797 kubelet[2614]: E0213 08:40:17.381722 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.381797 kubelet[2614]: W0213 08:40:17.381756 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.382168 kubelet[2614]: E0213 08:40:17.381842 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.382349 kubelet[2614]: E0213 08:40:17.382295 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.382349 kubelet[2614]: W0213 08:40:17.382331 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.382584 kubelet[2614]: E0213 08:40:17.382377 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.382910 kubelet[2614]: E0213 08:40:17.382856 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.382910 kubelet[2614]: W0213 08:40:17.382881 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.383178 kubelet[2614]: E0213 08:40:17.382922 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.383516 kubelet[2614]: E0213 08:40:17.383433 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.383516 kubelet[2614]: W0213 08:40:17.383470 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.383516 kubelet[2614]: E0213 08:40:17.383518 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.384221 kubelet[2614]: E0213 08:40:17.384184 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.384221 kubelet[2614]: W0213 08:40:17.384219 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.384565 kubelet[2614]: E0213 08:40:17.384266 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.384853 kubelet[2614]: E0213 08:40:17.384821 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.384853 kubelet[2614]: W0213 08:40:17.384848 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.385132 kubelet[2614]: E0213 08:40:17.384896 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:17.385512 kubelet[2614]: E0213 08:40:17.385474 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.385652 kubelet[2614]: W0213 08:40:17.385510 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.385652 kubelet[2614]: E0213 08:40:17.385557 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:17.386123 kubelet[2614]: E0213 08:40:17.386090 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:17.386123 kubelet[2614]: W0213 08:40:17.386118 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:17.386375 kubelet[2614]: E0213 08:40:17.386155 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.264332 kubelet[2614]: E0213 08:40:19.264288 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:19.294554 kubelet[2614]: E0213 08:40:19.294442 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.294554 kubelet[2614]: W0213 08:40:19.294487 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.294554 kubelet[2614]: E0213 08:40:19.294534 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.295124 kubelet[2614]: E0213 08:40:19.295056 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.295124 kubelet[2614]: W0213 08:40:19.295080 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.295124 kubelet[2614]: E0213 08:40:19.295117 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.295696 kubelet[2614]: E0213 08:40:19.295606 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.295696 kubelet[2614]: W0213 08:40:19.295639 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.295696 kubelet[2614]: E0213 08:40:19.295678 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.296414 kubelet[2614]: E0213 08:40:19.296324 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.296414 kubelet[2614]: W0213 08:40:19.296359 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.296414 kubelet[2614]: E0213 08:40:19.296398 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.297035 kubelet[2614]: E0213 08:40:19.296947 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.297035 kubelet[2614]: W0213 08:40:19.296987 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.297035 kubelet[2614]: E0213 08:40:19.297027 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.297642 kubelet[2614]: E0213 08:40:19.297552 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.297642 kubelet[2614]: W0213 08:40:19.297585 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.297642 kubelet[2614]: E0213 08:40:19.297625 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.298241 kubelet[2614]: E0213 08:40:19.298152 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.298241 kubelet[2614]: W0213 08:40:19.298178 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.298241 kubelet[2614]: E0213 08:40:19.298213 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.298755 kubelet[2614]: E0213 08:40:19.298700 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.298755 kubelet[2614]: W0213 08:40:19.298732 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.299009 kubelet[2614]: E0213 08:40:19.298771 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.395802 kubelet[2614]: E0213 08:40:19.395693 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.395802 kubelet[2614]: W0213 08:40:19.395735 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.395802 kubelet[2614]: E0213 08:40:19.395782 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.396533 kubelet[2614]: E0213 08:40:19.396442 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.396533 kubelet[2614]: W0213 08:40:19.396475 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.396533 kubelet[2614]: E0213 08:40:19.396515 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.397076 kubelet[2614]: E0213 08:40:19.397042 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.397076 kubelet[2614]: W0213 08:40:19.397069 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.397398 kubelet[2614]: E0213 08:40:19.397104 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.397678 kubelet[2614]: E0213 08:40:19.397604 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.397678 kubelet[2614]: W0213 08:40:19.397637 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.397678 kubelet[2614]: E0213 08:40:19.397678 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.398309 kubelet[2614]: E0213 08:40:19.398272 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.398309 kubelet[2614]: W0213 08:40:19.398306 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.398590 kubelet[2614]: E0213 08:40:19.398347 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:19.399360 kubelet[2614]: E0213 08:40:19.399262 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:19.399360 kubelet[2614]: W0213 08:40:19.399296 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:19.399360 kubelet[2614]: E0213 08:40:19.399339 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:19.718308 systemd[1]: Started sshd@86-145.40.67.89:22-139.178.68.195:47638.service. Feb 13 08:40:19.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.67.89:22-139.178.68.195:47638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:19.745122 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:40:19.745204 kernel: audit: type=1130 audit(1707813619.717:1832): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.67.89:22-139.178.68.195:47638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:19.863000 audit[5593]: USER_ACCT pid=5593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.864951 sshd[5593]: Accepted publickey for core from 139.178.68.195 port 47638 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:19.869221 sshd[5593]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:19.878823 systemd-logind[1463]: New session 67 of user core. Feb 13 08:40:19.881097 systemd[1]: Started session-67.scope. Feb 13 08:40:19.867000 audit[5593]: CRED_ACQ pid=5593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.967258 sshd[5593]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:19.968564 systemd[1]: sshd@86-145.40.67.89:22-139.178.68.195:47638.service: Deactivated successfully. Feb 13 08:40:19.968984 systemd[1]: session-67.scope: Deactivated successfully. Feb 13 08:40:19.969343 systemd-logind[1463]: Session 67 logged out. Waiting for processes to exit. Feb 13 08:40:19.969820 systemd-logind[1463]: Removed session 67. 
Feb 13 08:40:20.046604 kernel: audit: type=1101 audit(1707813619.863:1833): pid=5593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:20.046638 kernel: audit: type=1103 audit(1707813619.867:1834): pid=5593 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:20.046652 kernel: audit: type=1006 audit(1707813619.867:1835): pid=5593 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=67 res=1 Feb 13 08:40:20.105224 kernel: audit: type=1300 audit(1707813619.867:1835): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8e24bbc0 a2=3 a3=0 items=0 ppid=1 pid=5593 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:19.867000 audit[5593]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc8e24bbc0 a2=3 a3=0 items=0 ppid=1 pid=5593 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:20.197256 kernel: audit: type=1327 audit(1707813619.867:1835): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:19.867000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:20.227775 kernel: audit: type=1105 audit(1707813619.889:1836): pid=5593 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.889000 audit[5593]: USER_START pid=5593 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:20.322277 kernel: audit: type=1103 audit(1707813619.890:1837): pid=5595 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.890000 audit[5595]: CRED_ACQ pid=5595 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:20.411315 kernel: audit: type=1106 audit(1707813619.966:1838): pid=5593 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.966000 audit[5593]: USER_END pid=5593 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:20.456774 kubelet[2614]: E0213 08:40:20.456765 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:20.506754 
kernel: audit: type=1104 audit(1707813619.966:1839): pid=5593 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.966000 audit[5593]: CRED_DISP pid=5593 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:19.967000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-145.40.67.89:22-139.178.68.195:47638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:21.265148 kubelet[2614]: E0213 08:40:21.265105 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:23.264993 kubelet[2614]: E0213 08:40:23.264881 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:24.976356 systemd[1]: Started sshd@87-145.40.67.89:22-139.178.68.195:47644.service. Feb 13 08:40:24.975000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.67.89:22-139.178.68.195:47644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:25.003229 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:25.003282 kernel: audit: type=1130 audit(1707813624.975:1841): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.67.89:22-139.178.68.195:47644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:25.120000 audit[5620]: USER_ACCT pid=5620 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.121398 sshd[5620]: Accepted publickey for core from 139.178.68.195 port 47644 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:25.122214 sshd[5620]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:25.124525 systemd-logind[1463]: New session 68 of user core. Feb 13 08:40:25.125035 systemd[1]: Started session-68.scope. Feb 13 08:40:25.205890 sshd[5620]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:25.207382 systemd[1]: sshd@87-145.40.67.89:22-139.178.68.195:47644.service: Deactivated successfully. Feb 13 08:40:25.207789 systemd[1]: session-68.scope: Deactivated successfully. Feb 13 08:40:25.208163 systemd-logind[1463]: Session 68 logged out. Waiting for processes to exit. Feb 13 08:40:25.208731 systemd-logind[1463]: Removed session 68. 
Feb 13 08:40:25.121000 audit[5620]: CRED_ACQ pid=5620 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.264841 kubelet[2614]: E0213 08:40:25.264783 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:25.303558 kernel: audit: type=1101 audit(1707813625.120:1842): pid=5620 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.303593 kernel: audit: type=1103 audit(1707813625.121:1843): pid=5620 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.303610 kernel: audit: type=1006 audit(1707813625.121:1844): pid=5620 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=68 res=1 Feb 13 08:40:25.361676 kernel: audit: type=1300 audit(1707813625.121:1844): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc540d760 a2=3 a3=0 items=0 ppid=1 pid=5620 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:25.121000 audit[5620]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc540d760 a2=3 a3=0 items=0 ppid=1 pid=5620 
auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:25.453478 kernel: audit: type=1327 audit(1707813625.121:1844): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:25.121000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:25.457947 kubelet[2614]: E0213 08:40:25.457939 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:25.484008 kernel: audit: type=1105 audit(1707813625.126:1845): pid=5620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.126000 audit[5620]: USER_START pid=5620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.578439 kernel: audit: type=1103 audit(1707813625.127:1846): pid=5622 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.127000 audit[5622]: CRED_ACQ pid=5622 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.667644 kernel: audit: type=1106 audit(1707813625.205:1847): pid=5620 uid=0 
auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.205000 audit[5620]: USER_END pid=5620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.763151 kernel: audit: type=1104 audit(1707813625.205:1848): pid=5620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.205000 audit[5620]: CRED_DISP pid=5620 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:25.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-145.40.67.89:22-139.178.68.195:47644 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:27.264735 kubelet[2614]: E0213 08:40:27.264717 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:29.264523 kubelet[2614]: E0213 08:40:29.264475 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:30.216232 systemd[1]: Started sshd@88-145.40.67.89:22-139.178.68.195:36050.service. Feb 13 08:40:30.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.67.89:22-139.178.68.195:36050 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:30.243208 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:30.243250 kernel: audit: type=1130 audit(1707813630.215:1850): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.67.89:22-139.178.68.195:36050 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:30.360000 audit[5645]: USER_ACCT pid=5645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.361405 sshd[5645]: Accepted publickey for core from 139.178.68.195 port 36050 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:30.363222 sshd[5645]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:30.365818 systemd-logind[1463]: New session 69 of user core. Feb 13 08:40:30.366311 systemd[1]: Started session-69.scope. Feb 13 08:40:30.446011 sshd[5645]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:30.447440 systemd[1]: sshd@88-145.40.67.89:22-139.178.68.195:36050.service: Deactivated successfully. Feb 13 08:40:30.447843 systemd[1]: session-69.scope: Deactivated successfully. Feb 13 08:40:30.448268 systemd-logind[1463]: Session 69 logged out. Waiting for processes to exit. Feb 13 08:40:30.448808 systemd-logind[1463]: Removed session 69. 
Feb 13 08:40:30.362000 audit[5645]: CRED_ACQ pid=5645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.459240 kubelet[2614]: E0213 08:40:30.459198 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:30.543430 kernel: audit: type=1101 audit(1707813630.360:1851): pid=5645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.543466 kernel: audit: type=1103 audit(1707813630.362:1852): pid=5645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.543481 kernel: audit: type=1006 audit(1707813630.362:1853): pid=5645 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=69 res=1 Feb 13 08:40:30.602013 kernel: audit: type=1300 audit(1707813630.362:1853): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5e3318a0 a2=3 a3=0 items=0 ppid=1 pid=5645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:30.362000 audit[5645]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5e3318a0 a2=3 a3=0 items=0 ppid=1 pid=5645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd" exe="/usr/sbin/sshd" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:30.694082 kernel: audit: type=1327 audit(1707813630.362:1853): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:30.362000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:30.724624 kernel: audit: type=1105 audit(1707813630.367:1854): pid=5645 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.367000 audit[5645]: USER_START pid=5645 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.819104 kernel: audit: type=1103 audit(1707813630.368:1855): pid=5647 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.368000 audit[5647]: CRED_ACQ pid=5647 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.445000 audit[5645]: USER_END pid=5645 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:31.003896 kernel: audit: type=1106 audit(1707813630.445:1856): pid=5645 uid=0 auid=500 ses=69 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:31.003935 kernel: audit: type=1104 audit(1707813630.445:1857): pid=5645 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.445000 audit[5645]: CRED_DISP pid=5645 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:30.446000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-145.40.67.89:22-139.178.68.195:36050 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:31.264221 kubelet[2614]: E0213 08:40:31.264176 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:33.265264 kubelet[2614]: E0213 08:40:33.265153 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:35.265456 kubelet[2614]: E0213 08:40:35.265354 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:35.457187 systemd[1]: Started sshd@89-145.40.67.89:22-139.178.68.195:36066.service. Feb 13 08:40:35.456000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.67.89:22-139.178.68.195:36066 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:35.460619 kubelet[2614]: E0213 08:40:35.460566 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:35.500821 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:35.500894 kernel: audit: type=1130 audit(1707813635.456:1859): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.67.89:22-139.178.68.195:36066 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:35.610000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.621989 sshd[5670]: Accepted publickey for core from 139.178.68.195 port 36066 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:35.625851 sshd[5670]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:35.635620 systemd-logind[1463]: New session 70 of user core. Feb 13 08:40:35.638866 systemd[1]: Started session-70.scope. 
Feb 13 08:40:35.610000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001a64e10 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:35.705038 kernel: audit: type=1400 audit(1707813635.610:1860): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.705077 kernel: audit: type=1300 audit(1707813635.610:1860): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001a64e10 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:35.778751 sshd[5670]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:35.780392 systemd[1]: sshd@89-145.40.67.89:22-139.178.68.195:36066.service: Deactivated successfully. Feb 13 08:40:35.781155 systemd[1]: session-70.scope: Deactivated successfully. Feb 13 08:40:35.781717 systemd-logind[1463]: Session 70 logged out. Waiting for processes to exit. Feb 13 08:40:35.782183 systemd-logind[1463]: Removed session 70. 
Feb 13 08:40:35.610000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:35.916935 kernel: audit: type=1327 audit(1707813635.610:1860): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:35.916973 kernel: audit: type=1400 audit(1707813635.610:1861): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.610000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:36.006239 kernel: audit: type=1300 audit(1707813635.610:1861): arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b44c20 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:35.610000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b44c20 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:36.126122 kernel: audit: type=1327 audit(1707813635.610:1861): proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:35.610000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:36.219325 kernel: audit: type=1101 audit(1707813635.620:1862): pid=5670 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:35.620000 audit[5670]: USER_ACCT pid=5670 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:35.624000 audit[5670]: CRED_ACQ pid=5670 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:36.402741 kernel: audit: type=1103 audit(1707813635.624:1863): pid=5670 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:36.402792 kernel: audit: 
type=1006 audit(1707813635.624:1864): pid=5670 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=70 res=1 Feb 13 08:40:35.624000 audit[5670]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd8a55cc0 a2=3 a3=0 items=0 ppid=1 pid=5670 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:35.624000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:35.646000 audit[5670]: USER_START pid=5670 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:35.647000 audit[5672]: CRED_ACQ pid=5672 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:35.778000 audit[5670]: USER_END pid=5670 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:35.778000 audit[5670]: CRED_DISP pid=5670 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:35.779000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-145.40.67.89:22-139.178.68.195:36066 comm="systemd" 
exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:35.894000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.894000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c011d413b0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:40:35.894000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:40:35.894000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.894000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00e6c2000 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:40:35.894000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 
08:40:35.894000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.894000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c002e85280 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:40:35.894000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:40:35.894000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.894000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c004f2f4a0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:40:35.894000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:40:35.894000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" 
path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.894000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=65 a1=c011d41410 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:40:35.894000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:40:35.894000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:35.894000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0118ca480 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:40:35.894000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:40:37.264894 kubelet[2614]: E0213 08:40:37.264802 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:39.265075 kubelet[2614]: E0213 08:40:39.265030 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:40.462141 kubelet[2614]: E0213 08:40:40.462045 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:40.788255 systemd[1]: Started sshd@90-145.40.67.89:22-139.178.68.195:58290.service. Feb 13 08:40:40.787000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.67.89:22-139.178.68.195:58290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:40.815434 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:40:40.815523 kernel: audit: type=1130 audit(1707813640.787:1876): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.67.89:22-139.178.68.195:58290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:40.934000 audit[5697]: USER_ACCT pid=5697 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:40.935965 sshd[5697]: Accepted publickey for core from 139.178.68.195 port 58290 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:40.938956 sshd[5697]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:40.944210 systemd-logind[1463]: New session 71 of user core. Feb 13 08:40:40.945232 systemd[1]: Started session-71.scope. Feb 13 08:40:41.023805 sshd[5697]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:41.025719 systemd[1]: sshd@90-145.40.67.89:22-139.178.68.195:58290.service: Deactivated successfully. Feb 13 08:40:41.026146 systemd[1]: session-71.scope: Deactivated successfully. Feb 13 08:40:41.026557 systemd-logind[1463]: Session 71 logged out. Waiting for processes to exit. Feb 13 08:40:41.027071 systemd-logind[1463]: Removed session 71. 
Feb 13 08:40:40.937000 audit[5697]: CRED_ACQ pid=5697 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.118832 kernel: audit: type=1101 audit(1707813640.934:1877): pid=5697 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.118865 kernel: audit: type=1103 audit(1707813640.937:1878): pid=5697 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.118881 kernel: audit: type=1006 audit(1707813640.937:1879): pid=5697 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=71 res=1 Feb 13 08:40:41.177592 kernel: audit: type=1300 audit(1707813640.937:1879): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd1a00330 a2=3 a3=0 items=0 ppid=1 pid=5697 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:40.937000 audit[5697]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd1a00330 a2=3 a3=0 items=0 ppid=1 pid=5697 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:41.264152 kubelet[2614]: E0213 08:40:41.264113 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:41.269849 kernel: audit: type=1327 audit(1707813640.937:1879): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:40.937000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:41.300707 kernel: audit: type=1105 audit(1707813640.946:1880): pid=5697 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:40.946000 audit[5697]: USER_START pid=5697 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.347186 kubelet[2614]: E0213 08:40:41.347148 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:41.347186 kubelet[2614]: W0213 08:40:41.347156 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:41.347186 kubelet[2614]: E0213 08:40:41.347166 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:41.347331 kubelet[2614]: E0213 08:40:41.347278 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:41.347331 kubelet[2614]: W0213 08:40:41.347285 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:41.347331 kubelet[2614]: E0213 08:40:41.347292 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:41.347410 kubelet[2614]: E0213 08:40:41.347401 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:41.347410 kubelet[2614]: W0213 08:40:41.347407 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:41.347449 kubelet[2614]: E0213 08:40:41.347414 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:41.347567 kubelet[2614]: E0213 08:40:41.347532 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:41.347567 kubelet[2614]: W0213 08:40:41.347539 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:41.347567 kubelet[2614]: E0213 08:40:41.347546 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:41.396547 kernel: audit: type=1103 audit(1707813640.947:1881): pid=5699 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:40.947000 audit[5699]: CRED_ACQ pid=5699 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.486341 kernel: audit: type=1400 audit(1707813640.951:1882): avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:41.576528 kernel: audit: type=1300 audit(1707813640.951:1882): arch=c000003e syscall=254 success=no exit=-13 a0=b 
a1=c0023532a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0023532a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002b45780 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:40.951000 audit[2442]: AVC avc: denied { watch } for pid=2442 
comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:40.951000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002152ce0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:40.951000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:40.952000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:40:40.952000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b458c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:40:40.952000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:40:41.023000 audit[5697]: USER_END pid=5697 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.023000 audit[5697]: CRED_DISP pid=5697 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:41.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-145.40.67.89:22-139.178.68.195:58290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:43.264420 kubelet[2614]: E0213 08:40:43.264373 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:45.264985 kubelet[2614]: E0213 08:40:45.264963 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:45.464283 kubelet[2614]: E0213 08:40:45.464221 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:46.033109 systemd[1]: Started sshd@91-145.40.67.89:22-139.178.68.195:51276.service. 
Feb 13 08:40:46.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.67.89:22-139.178.68.195:51276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:46.060592 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:40:46.060659 kernel: audit: type=1130 audit(1707813646.032:1889): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.67.89:22-139.178.68.195:51276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:46.177000 audit[5726]: USER_ACCT pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.178552 sshd[5726]: Accepted publickey for core from 139.178.68.195 port 51276 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:46.179285 sshd[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:46.181909 systemd-logind[1463]: New session 72 of user core. Feb 13 08:40:46.183022 systemd[1]: Started session-72.scope. Feb 13 08:40:46.261736 sshd[5726]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:46.263423 systemd[1]: sshd@91-145.40.67.89:22-139.178.68.195:51276.service: Deactivated successfully. Feb 13 08:40:46.264177 systemd[1]: session-72.scope: Deactivated successfully. Feb 13 08:40:46.264748 systemd-logind[1463]: Session 72 logged out. Waiting for processes to exit. Feb 13 08:40:46.265307 systemd-logind[1463]: Removed session 72. 
Feb 13 08:40:46.178000 audit[5726]: CRED_ACQ pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.360863 kernel: audit: type=1101 audit(1707813646.177:1890): pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.360902 kernel: audit: type=1103 audit(1707813646.178:1891): pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.360919 kernel: audit: type=1006 audit(1707813646.178:1892): pid=5726 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=72 res=1 Feb 13 08:40:46.419459 kernel: audit: type=1300 audit(1707813646.178:1892): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4f1b1740 a2=3 a3=0 items=0 ppid=1 pid=5726 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:46.178000 audit[5726]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4f1b1740 a2=3 a3=0 items=0 ppid=1 pid=5726 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:46.511524 kernel: audit: type=1327 audit(1707813646.178:1892): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:46.178000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:40:46.182000 audit[5726]: USER_START pid=5726 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.636791 kernel: audit: type=1105 audit(1707813646.182:1893): pid=5726 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.636824 kernel: audit: type=1103 audit(1707813646.185:1894): pid=5728 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.185000 audit[5728]: CRED_ACQ pid=5728 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.259000 audit[5726]: USER_END pid=5726 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.821632 kernel: audit: type=1106 audit(1707813646.259:1895): pid=5726 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:40:46.821670 kernel: audit: type=1104 audit(1707813646.259:1896): pid=5726 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.259000 audit[5726]: CRED_DISP pid=5726 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:46.259000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-145.40.67.89:22-139.178.68.195:51276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:47.264880 kubelet[2614]: E0213 08:40:47.264782 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:49.264938 kubelet[2614]: E0213 08:40:49.264891 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:50.466323 kubelet[2614]: E0213 08:40:50.466226 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:51.264739 kubelet[2614]: E0213 08:40:51.264633 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:51.271187 systemd[1]: Started sshd@92-145.40.67.89:22-139.178.68.195:51284.service. Feb 13 08:40:51.270000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.67.89:22-139.178.68.195:51284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:51.298418 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:51.298481 kernel: audit: type=1130 audit(1707813651.270:1898): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.67.89:22-139.178.68.195:51284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:51.416000 audit[5750]: USER_ACCT pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.417379 sshd[5750]: Accepted publickey for core from 139.178.68.195 port 51284 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:51.418256 sshd[5750]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:51.420886 systemd-logind[1463]: New session 73 of user core. Feb 13 08:40:51.421522 systemd[1]: Started session-73.scope. Feb 13 08:40:51.499874 sshd[5750]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:51.501401 systemd[1]: sshd@92-145.40.67.89:22-139.178.68.195:51284.service: Deactivated successfully. Feb 13 08:40:51.501825 systemd[1]: session-73.scope: Deactivated successfully. 
Feb 13 08:40:51.502227 systemd-logind[1463]: Session 73 logged out. Waiting for processes to exit. Feb 13 08:40:51.502706 systemd-logind[1463]: Removed session 73. Feb 13 08:40:51.417000 audit[5750]: CRED_ACQ pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.599384 kernel: audit: type=1101 audit(1707813651.416:1899): pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.599425 kernel: audit: type=1103 audit(1707813651.417:1900): pid=5750 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.599440 kernel: audit: type=1006 audit(1707813651.417:1901): pid=5750 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=73 res=1 Feb 13 08:40:51.658010 kernel: audit: type=1300 audit(1707813651.417:1901): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd51e4c520 a2=3 a3=0 items=0 ppid=1 pid=5750 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:51.417000 audit[5750]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd51e4c520 a2=3 a3=0 items=0 ppid=1 pid=5750 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:51.750010 kernel: audit: type=1327 
audit(1707813651.417:1901): proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:51.417000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:51.780553 kernel: audit: type=1105 audit(1707813651.422:1902): pid=5750 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.422000 audit[5750]: USER_START pid=5750 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.875133 kernel: audit: type=1103 audit(1707813651.423:1903): pid=5755 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.423000 audit[5755]: CRED_ACQ pid=5755 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.964364 kernel: audit: type=1106 audit(1707813651.499:1904): pid=5750 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.499000 audit[5750]: USER_END pid=5750 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:52.059900 kernel: audit: type=1104 audit(1707813651.499:1905): pid=5750 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.499000 audit[5750]: CRED_DISP pid=5750 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:51.500000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-145.40.67.89:22-139.178.68.195:51284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:53.264948 kubelet[2614]: E0213 08:40:53.264930 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:55.264810 kubelet[2614]: E0213 08:40:55.264789 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:55.468063 kubelet[2614]: E0213 08:40:55.467998 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:40:56.402110 env[1475]: time="2024-02-13T08:40:56.402082811Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:40:56.402869 env[1475]: time="2024-02-13T08:40:56.402827719Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:40:56.403911 env[1475]: time="2024-02-13T08:40:56.403899674Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:40:56.404912 env[1475]: time="2024-02-13T08:40:56.404899060Z" level=info msg="ImageCreate event 
&ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:5f2d3b8c354a4eb6de46e786889913916e620c6c256982fb8d0f1a1d36a282bc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:40:56.405480 env[1475]: time="2024-02-13T08:40:56.405449368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.27.0\" returns image reference \"sha256:b33768e0da1f8a5788a6a5d8ac2dcf15292ea9f3717de450f946c0a055b3532c\"" Feb 13 08:40:56.405915 env[1475]: time="2024-02-13T08:40:56.405902686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\"" Feb 13 08:40:56.409400 env[1475]: time="2024-02-13T08:40:56.409382266Z" level=info msg="CreateContainer within sandbox \"1a7db042edb6eca94ac1ae5029cc56d7a3eda26bfaf7e19b9b0ea9581cbe6c99\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 08:40:56.413731 env[1475]: time="2024-02-13T08:40:56.413695308Z" level=info msg="CreateContainer within sandbox \"1a7db042edb6eca94ac1ae5029cc56d7a3eda26bfaf7e19b9b0ea9581cbe6c99\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5cb715c2b6e18f28cdf0f4bff8438a9fbe7b7b3c7b9f73dea5da056b57373500\"" Feb 13 08:40:56.414008 env[1475]: time="2024-02-13T08:40:56.413952367Z" level=info msg="StartContainer for \"5cb715c2b6e18f28cdf0f4bff8438a9fbe7b7b3c7b9f73dea5da056b57373500\"" Feb 13 08:40:56.421977 systemd[1]: Started cri-containerd-5cb715c2b6e18f28cdf0f4bff8438a9fbe7b7b3c7b9f73dea5da056b57373500.scope. 
Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.456989 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:40:56.457071 kernel: audit: type=1400 audit(1707813656.428:1907): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.522890 systemd[1]: Started sshd@93-145.40.67.89:22-139.178.68.195:52812.service. Feb 13 08:40:56.582987 kernel: audit: type=1400 audit(1707813656.428:1908): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.583022 kernel: audit: type=1400 audit(1707813656.428:1909): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.645709 kernel: audit: type=1400 audit(1707813656.428:1910): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:40:56.673986 sshd[5804]: Accepted publickey for core from 139.178.68.195 port 52812 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:40:56.675242 sshd[5804]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:40:56.677800 systemd-logind[1463]: New session 74 of user core. Feb 13 08:40:56.678375 systemd[1]: Started session-74.scope. Feb 13 08:40:56.709251 kernel: audit: type=1400 audit(1707813656.428:1911): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.756520 sshd[5804]: pam_unix(sshd:session): session closed for user core Feb 13 08:40:56.757951 systemd[1]: sshd@93-145.40.67.89:22-139.178.68.195:52812.service: Deactivated successfully. Feb 13 08:40:56.758378 systemd[1]: session-74.scope: Deactivated successfully. Feb 13 08:40:56.758771 systemd-logind[1463]: Session 74 logged out. Waiting for processes to exit. Feb 13 08:40:56.759432 systemd-logind[1463]: Removed session 74. 
Feb 13 08:40:56.772709 kernel: audit: type=1400 audit(1707813656.428:1912): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.836228 kernel: audit: type=1400 audit(1707813656.428:1913): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.899720 kernel: audit: type=1400 audit(1707813656.428:1914): avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { perfmon } for pid=1 comm="systemd" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.963173 kernel: audit: type=1400 audit(1707813656.428:1915): avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.428000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:57.026272 kernel: audit: type=1400 audit(1707813656.519:1916): avc: denied { bpf } for pid=1 comm="systemd" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[1]: AVC avc: denied { bpf } for pid=1 comm="systemd" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit: BPF prog-id=135 op=LOAD Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=c000145c48 a2=10 a3=1c items=0 ppid=3245 pid=5787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:56.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563623731356332623665313866323863646630663462666638343338 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=0 a1=c0001456b0 a2=3c a3=8 items=0 ppid=3245 pid=5787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:56.519000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563623731356332623665313866323863646630663462666638343338 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit[5787]: AVC 
avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.522000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.67.89:22-139.178.68.195:52812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:40:56.519000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.519000 audit: BPF prog-id=136 op=LOAD Feb 13 08:40:56.519000 audit[5787]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c0001459d8 a2=78 a3=c00027fdc0 items=0 ppid=3245 pid=5787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:56.519000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563623731356332623665313866323863646630663462666638343338 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:40:56.644000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.673000 audit[5804]: USER_ACCT pid=5804 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:56.674000 audit[5804]: CRED_ACQ pid=5804 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:56.674000 audit[5804]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffd44f8bf0 a2=3 a3=0 items=0 ppid=1 pid=5804 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:56.674000 audit: PROCTITLE 
proctitle=737368643A20636F7265205B707269765D Feb 13 08:40:56.679000 audit[5804]: USER_START pid=5804 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:56.679000 audit[5806]: CRED_ACQ pid=5806 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:56.756000 audit[5804]: USER_END pid=5804 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:56.756000 audit[5804]: CRED_DISP pid=5804 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:40:56.757000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-145.40.67.89:22-139.178.68.195:52812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:40:56.644000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.644000 audit: BPF prog-id=137 op=LOAD Feb 13 08:40:56.644000 audit[5787]: SYSCALL arch=c000003e syscall=321 success=yes exit=18 a0=5 a1=c000145770 a2=78 a3=c00027fe08 items=0 ppid=3245 pid=5787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:56.644000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563623731356332623665313866323863646630663462666638343338 Feb 13 08:40:56.771000 audit: BPF prog-id=137 op=UNLOAD Feb 13 08:40:56.771000 audit: BPF prog-id=136 op=UNLOAD Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { perfmon } for pid=5787 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit[5787]: AVC avc: denied { bpf } for pid=5787 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:40:56.771000 audit: BPF prog-id=138 op=LOAD Feb 13 08:40:56.771000 audit[5787]: SYSCALL arch=c000003e syscall=321 success=yes exit=16 a0=5 a1=c000145c30 a2=78 a3=c0002910a8 items=0 ppid=3245 pid=5787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:56.771000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563623731356332623665313866323863646630663462666638343338 Feb 13 08:40:57.106703 env[1475]: time="2024-02-13T08:40:57.106653228Z" level=info msg="StartContainer for 
\"5cb715c2b6e18f28cdf0f4bff8438a9fbe7b7b3c7b9f73dea5da056b57373500\" returns successfully" Feb 13 08:40:57.264622 kubelet[2614]: E0213 08:40:57.264413 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:57.904903 kubelet[2614]: I0213 08:40:57.904833 2614 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-5c7b7d8757-mv8pm" podStartSLOduration=-9.223371377950024e+09 pod.CreationTimestamp="2024-02-13 08:29:59 +0000 UTC" firstStartedPulling="2024-02-13 08:29:59.885333344 +0000 UTC m=+19.688225180" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-02-13 08:40:57.90347378 +0000 UTC m=+677.706365684" watchObservedRunningTime="2024-02-13 08:40:57.9047516 +0000 UTC m=+677.707643473" Feb 13 08:40:57.945000 audit[5879]: NETFILTER_CFG table=filter:109 family=2 entries=13 op=nft_register_rule pid=5879 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:40:57.945000 audit[5879]: SYSCALL arch=c000003e syscall=46 success=yes exit=4028 a0=3 a1=7ffef7600e30 a2=0 a3=7ffef7600e1c items=0 ppid=2919 pid=5879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:57.945000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:40:57.961962 kubelet[2614]: E0213 08:40:57.961894 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.962217 kubelet[2614]: W0213 08:40:57.962086 2614 driver-call.go:149] 
FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.962217 kubelet[2614]: E0213 08:40:57.962164 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.962714 kubelet[2614]: E0213 08:40:57.962674 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.962714 kubelet[2614]: W0213 08:40:57.962705 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.963077 kubelet[2614]: E0213 08:40:57.962754 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.963267 kubelet[2614]: E0213 08:40:57.963193 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.963267 kubelet[2614]: W0213 08:40:57.963225 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.963267 kubelet[2614]: E0213 08:40:57.963270 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:57.963818 kubelet[2614]: E0213 08:40:57.963781 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.963818 kubelet[2614]: W0213 08:40:57.963810 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.964172 kubelet[2614]: E0213 08:40:57.963858 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.964385 kubelet[2614]: E0213 08:40:57.964350 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.964385 kubelet[2614]: W0213 08:40:57.964378 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.964736 kubelet[2614]: E0213 08:40:57.964424 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:57.964938 kubelet[2614]: E0213 08:40:57.964892 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.965123 kubelet[2614]: W0213 08:40:57.964937 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.965123 kubelet[2614]: E0213 08:40:57.964986 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.965670 kubelet[2614]: E0213 08:40:57.965625 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.965670 kubelet[2614]: W0213 08:40:57.965662 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.966142 kubelet[2614]: E0213 08:40:57.965715 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:57.966336 kubelet[2614]: E0213 08:40:57.966156 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.966336 kubelet[2614]: W0213 08:40:57.966187 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.966336 kubelet[2614]: E0213 08:40:57.966235 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.966785 kubelet[2614]: E0213 08:40:57.966658 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.966785 kubelet[2614]: W0213 08:40:57.966689 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.966785 kubelet[2614]: E0213 08:40:57.966734 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:57.967327 kubelet[2614]: E0213 08:40:57.967143 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.967327 kubelet[2614]: W0213 08:40:57.967172 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.967327 kubelet[2614]: E0213 08:40:57.967217 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.967753 kubelet[2614]: E0213 08:40:57.967668 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.967753 kubelet[2614]: W0213 08:40:57.967698 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.967753 kubelet[2614]: E0213 08:40:57.967740 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:57.968264 kubelet[2614]: E0213 08:40:57.968146 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:57.968264 kubelet[2614]: W0213 08:40:57.968179 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:57.968264 kubelet[2614]: E0213 08:40:57.968225 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:57.946000 audit[5879]: NETFILTER_CFG table=nat:110 family=2 entries=27 op=nft_register_chain pid=5879 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Feb 13 08:40:57.946000 audit[5879]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffef7600e30 a2=0 a3=7ffef7600e1c items=0 ppid=2919 pid=5879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:40:57.946000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Feb 13 08:40:58.062316 kubelet[2614]: E0213 08:40:58.062247 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.062316 kubelet[2614]: W0213 08:40:58.062296 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.062849 kubelet[2614]: E0213 08:40:58.062358 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume 
plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.063094 kubelet[2614]: E0213 08:40:58.063014 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.063094 kubelet[2614]: W0213 08:40:58.063052 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.063494 kubelet[2614]: E0213 08:40:58.063118 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.063745 kubelet[2614]: E0213 08:40:58.063705 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.063745 kubelet[2614]: W0213 08:40:58.063738 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.064142 kubelet[2614]: E0213 08:40:58.063803 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.064522 kubelet[2614]: E0213 08:40:58.064428 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.064522 kubelet[2614]: W0213 08:40:58.064464 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.064522 kubelet[2614]: E0213 08:40:58.064511 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.065121 kubelet[2614]: E0213 08:40:58.065043 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.065121 kubelet[2614]: W0213 08:40:58.065070 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.065448 kubelet[2614]: E0213 08:40:58.065212 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.065616 kubelet[2614]: E0213 08:40:58.065570 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.065616 kubelet[2614]: W0213 08:40:58.065605 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.065960 kubelet[2614]: E0213 08:40:58.065737 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.066214 kubelet[2614]: E0213 08:40:58.066140 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.066214 kubelet[2614]: W0213 08:40:58.066173 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.066569 kubelet[2614]: E0213 08:40:58.066307 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.066732 kubelet[2614]: E0213 08:40:58.066691 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.066732 kubelet[2614]: W0213 08:40:58.066722 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.067007 kubelet[2614]: E0213 08:40:58.066768 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.067416 kubelet[2614]: E0213 08:40:58.067368 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.067416 kubelet[2614]: W0213 08:40:58.067402 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.067760 kubelet[2614]: E0213 08:40:58.067538 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.067909 kubelet[2614]: E0213 08:40:58.067892 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.068108 kubelet[2614]: W0213 08:40:58.067917 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.068108 kubelet[2614]: E0213 08:40:58.068017 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.068473 kubelet[2614]: E0213 08:40:58.068427 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.068473 kubelet[2614]: W0213 08:40:58.068461 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.068807 kubelet[2614]: E0213 08:40:58.068507 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.069029 kubelet[2614]: E0213 08:40:58.068984 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.069029 kubelet[2614]: W0213 08:40:58.069010 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.069338 kubelet[2614]: E0213 08:40:58.069048 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.069556 kubelet[2614]: E0213 08:40:58.069526 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.069680 kubelet[2614]: W0213 08:40:58.069553 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.069804 kubelet[2614]: E0213 08:40:58.069684 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.070099 kubelet[2614]: E0213 08:40:58.070013 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.070099 kubelet[2614]: W0213 08:40:58.070042 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.070099 kubelet[2614]: E0213 08:40:58.070078 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.070610 kubelet[2614]: E0213 08:40:58.070581 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.070610 kubelet[2614]: W0213 08:40:58.070607 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.070837 kubelet[2614]: E0213 08:40:58.070654 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.080778 kubelet[2614]: E0213 08:40:58.080733 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.080778 kubelet[2614]: W0213 08:40:58.080765 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.081143 kubelet[2614]: E0213 08:40:58.080804 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.081708 kubelet[2614]: E0213 08:40:58.081664 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.081708 kubelet[2614]: W0213 08:40:58.081691 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.082152 kubelet[2614]: E0213 08:40:58.081726 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.088783 kubelet[2614]: E0213 08:40:58.088686 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.088783 kubelet[2614]: W0213 08:40:58.088718 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.088783 kubelet[2614]: E0213 08:40:58.088773 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.976492 kubelet[2614]: E0213 08:40:58.976379 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.976492 kubelet[2614]: W0213 08:40:58.976423 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.976492 kubelet[2614]: E0213 08:40:58.976469 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.977581 kubelet[2614]: E0213 08:40:58.977025 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.977581 kubelet[2614]: W0213 08:40:58.977058 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.977581 kubelet[2614]: E0213 08:40:58.977096 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.977908 kubelet[2614]: E0213 08:40:58.977623 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.977908 kubelet[2614]: W0213 08:40:58.977654 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.977908 kubelet[2614]: E0213 08:40:58.977692 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.978387 kubelet[2614]: E0213 08:40:58.978297 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.978387 kubelet[2614]: W0213 08:40:58.978330 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.978387 kubelet[2614]: E0213 08:40:58.978369 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.978913 kubelet[2614]: E0213 08:40:58.978885 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.978913 kubelet[2614]: W0213 08:40:58.978912 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.979156 kubelet[2614]: E0213 08:40:58.978973 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.979533 kubelet[2614]: E0213 08:40:58.979446 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.979533 kubelet[2614]: W0213 08:40:58.979472 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.979533 kubelet[2614]: E0213 08:40:58.979508 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.980158 kubelet[2614]: E0213 08:40:58.980087 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.980158 kubelet[2614]: W0213 08:40:58.980113 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.980158 kubelet[2614]: E0213 08:40:58.980146 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.980712 kubelet[2614]: E0213 08:40:58.980636 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.980712 kubelet[2614]: W0213 08:40:58.980669 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.980712 kubelet[2614]: E0213 08:40:58.980710 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.981248 kubelet[2614]: E0213 08:40:58.981215 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.981369 kubelet[2614]: W0213 08:40:58.981249 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.981369 kubelet[2614]: E0213 08:40:58.981289 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.981863 kubelet[2614]: E0213 08:40:58.981798 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.981863 kubelet[2614]: W0213 08:40:58.981824 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.981863 kubelet[2614]: E0213 08:40:58.981859 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:58.982436 kubelet[2614]: E0213 08:40:58.982362 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.982436 kubelet[2614]: W0213 08:40:58.982400 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.982436 kubelet[2614]: E0213 08:40:58.982441 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:58.983056 kubelet[2614]: E0213 08:40:58.982979 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:58.983056 kubelet[2614]: W0213 08:40:58.983007 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:58.983056 kubelet[2614]: E0213 08:40:58.983047 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.072099 kubelet[2614]: E0213 08:40:59.072003 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.072099 kubelet[2614]: W0213 08:40:59.072052 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.072099 kubelet[2614]: E0213 08:40:59.072107 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.072772 kubelet[2614]: E0213 08:40:59.072686 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.072772 kubelet[2614]: W0213 08:40:59.072714 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.072772 kubelet[2614]: E0213 08:40:59.072765 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.073440 kubelet[2614]: E0213 08:40:59.073361 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.073440 kubelet[2614]: W0213 08:40:59.073400 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.073760 kubelet[2614]: E0213 08:40:59.073453 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.074089 kubelet[2614]: E0213 08:40:59.074015 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.074089 kubelet[2614]: W0213 08:40:59.074049 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.074089 kubelet[2614]: E0213 08:40:59.074095 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.074661 kubelet[2614]: E0213 08:40:59.074584 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.074661 kubelet[2614]: W0213 08:40:59.074619 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.074987 kubelet[2614]: E0213 08:40:59.074741 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.075177 kubelet[2614]: E0213 08:40:59.075143 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.075316 kubelet[2614]: W0213 08:40:59.075178 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.075316 kubelet[2614]: E0213 08:40:59.075300 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.075735 kubelet[2614]: E0213 08:40:59.075678 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.075735 kubelet[2614]: W0213 08:40:59.075722 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.075989 kubelet[2614]: E0213 08:40:59.075838 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.076318 kubelet[2614]: E0213 08:40:59.076238 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.076318 kubelet[2614]: W0213 08:40:59.076272 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.076318 kubelet[2614]: E0213 08:40:59.076318 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.077033 kubelet[2614]: E0213 08:40:59.076955 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.077033 kubelet[2614]: W0213 08:40:59.076993 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.077033 kubelet[2614]: E0213 08:40:59.077040 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.077618 kubelet[2614]: E0213 08:40:59.077563 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.077618 kubelet[2614]: W0213 08:40:59.077597 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.077871 kubelet[2614]: E0213 08:40:59.077717 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.078146 kubelet[2614]: E0213 08:40:59.078060 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.078146 kubelet[2614]: W0213 08:40:59.078087 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.078480 kubelet[2614]: E0213 08:40:59.078181 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.078617 kubelet[2614]: E0213 08:40:59.078591 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.078727 kubelet[2614]: W0213 08:40:59.078624 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.078846 kubelet[2614]: E0213 08:40:59.078761 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.079260 kubelet[2614]: E0213 08:40:59.079166 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.079260 kubelet[2614]: W0213 08:40:59.079201 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.079260 kubelet[2614]: E0213 08:40:59.079247 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.079831 kubelet[2614]: E0213 08:40:59.079800 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.079831 kubelet[2614]: W0213 08:40:59.079828 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.080113 kubelet[2614]: E0213 08:40:59.079972 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.080425 kubelet[2614]: E0213 08:40:59.080351 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.080425 kubelet[2614]: W0213 08:40:59.080385 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.080425 kubelet[2614]: E0213 08:40:59.080425 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.080898 kubelet[2614]: E0213 08:40:59.080867 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.080898 kubelet[2614]: W0213 08:40:59.080896 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.081162 kubelet[2614]: E0213 08:40:59.080960 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.081530 kubelet[2614]: E0213 08:40:59.081462 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.081530 kubelet[2614]: W0213 08:40:59.081496 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.081792 kubelet[2614]: E0213 08:40:59.081538 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.082425 kubelet[2614]: E0213 08:40:59.082331 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.082425 kubelet[2614]: W0213 08:40:59.082367 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.082425 kubelet[2614]: E0213 08:40:59.082405 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.264489 kubelet[2614]: E0213 08:40:59.264406 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:40:59.891263 kubelet[2614]: E0213 08:40:59.891227 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.891263 kubelet[2614]: W0213 08:40:59.891237 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.891263 kubelet[2614]: E0213 08:40:59.891249 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.891404 kubelet[2614]: E0213 08:40:59.891378 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.891404 kubelet[2614]: W0213 08:40:59.891385 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.891404 kubelet[2614]: E0213 08:40:59.891392 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.891576 kubelet[2614]: E0213 08:40:59.891538 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.891576 kubelet[2614]: W0213 08:40:59.891545 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.891576 kubelet[2614]: E0213 08:40:59.891554 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.891730 kubelet[2614]: E0213 08:40:59.891700 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.891730 kubelet[2614]: W0213 08:40:59.891705 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.891730 kubelet[2614]: E0213 08:40:59.891712 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.891821 kubelet[2614]: E0213 08:40:59.891816 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.891842 kubelet[2614]: W0213 08:40:59.891822 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.891842 kubelet[2614]: E0213 08:40:59.891827 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.891901 kubelet[2614]: E0213 08:40:59.891896 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.891901 kubelet[2614]: W0213 08:40:59.891900 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.891945 kubelet[2614]: E0213 08:40:59.891906 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.892027 kubelet[2614]: E0213 08:40:59.891994 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.892027 kubelet[2614]: W0213 08:40:59.891998 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.892027 kubelet[2614]: E0213 08:40:59.892004 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.892109 kubelet[2614]: E0213 08:40:59.892097 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.892109 kubelet[2614]: W0213 08:40:59.892101 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.892109 kubelet[2614]: E0213 08:40:59.892107 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.892238 kubelet[2614]: E0213 08:40:59.892201 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.892238 kubelet[2614]: W0213 08:40:59.892205 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.892238 kubelet[2614]: E0213 08:40:59.892211 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.892316 kubelet[2614]: E0213 08:40:59.892306 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.892316 kubelet[2614]: W0213 08:40:59.892310 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.892316 kubelet[2614]: E0213 08:40:59.892316 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.892380 kubelet[2614]: E0213 08:40:59.892375 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.892380 kubelet[2614]: W0213 08:40:59.892379 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.892470 kubelet[2614]: E0213 08:40:59.892384 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.892470 kubelet[2614]: E0213 08:40:59.892444 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.892470 kubelet[2614]: W0213 08:40:59.892448 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.892470 kubelet[2614]: E0213 08:40:59.892453 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.982621 kubelet[2614]: E0213 08:40:59.982499 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.982621 kubelet[2614]: W0213 08:40:59.982559 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.982621 kubelet[2614]: E0213 08:40:59.982616 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.983716 kubelet[2614]: E0213 08:40:59.983149 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.983716 kubelet[2614]: W0213 08:40:59.983174 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.983716 kubelet[2614]: E0213 08:40:59.983231 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.984082 kubelet[2614]: E0213 08:40:59.983785 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.984082 kubelet[2614]: W0213 08:40:59.983820 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.984082 kubelet[2614]: E0213 08:40:59.983871 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.984485 kubelet[2614]: E0213 08:40:59.984413 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.984485 kubelet[2614]: W0213 08:40:59.984451 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.984719 kubelet[2614]: E0213 08:40:59.984498 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.985064 kubelet[2614]: E0213 08:40:59.984983 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.985064 kubelet[2614]: W0213 08:40:59.985008 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.985412 kubelet[2614]: E0213 08:40:59.985145 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.985556 kubelet[2614]: E0213 08:40:59.985525 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.985678 kubelet[2614]: W0213 08:40:59.985558 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.985808 kubelet[2614]: E0213 08:40:59.985680 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.986098 kubelet[2614]: E0213 08:40:59.986053 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.986098 kubelet[2614]: W0213 08:40:59.986082 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.986374 kubelet[2614]: E0213 08:40:59.986171 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.986609 kubelet[2614]: E0213 08:40:59.986533 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.986609 kubelet[2614]: W0213 08:40:59.986568 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.986609 kubelet[2614]: E0213 08:40:59.986615 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.987247 kubelet[2614]: E0213 08:40:59.987172 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.987247 kubelet[2614]: W0213 08:40:59.987207 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.987247 kubelet[2614]: E0213 08:40:59.987246 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.987765 kubelet[2614]: E0213 08:40:59.987728 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.987765 kubelet[2614]: W0213 08:40:59.987763 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.988053 kubelet[2614]: E0213 08:40:59.987811 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.988438 kubelet[2614]: E0213 08:40:59.988364 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.988438 kubelet[2614]: W0213 08:40:59.988397 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.988438 kubelet[2614]: E0213 08:40:59.988445 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.988917 kubelet[2614]: E0213 08:40:59.988886 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.988917 kubelet[2614]: W0213 08:40:59.988914 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.989190 kubelet[2614]: E0213 08:40:59.989009 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.989362 kubelet[2614]: E0213 08:40:59.989327 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.989362 kubelet[2614]: W0213 08:40:59.989354 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.989653 kubelet[2614]: E0213 08:40:59.989393 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.989802 kubelet[2614]: E0213 08:40:59.989772 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.989920 kubelet[2614]: W0213 08:40:59.989804 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.989920 kubelet[2614]: E0213 08:40:59.989855 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.990371 kubelet[2614]: E0213 08:40:59.990344 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.990506 kubelet[2614]: W0213 08:40:59.990371 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.990506 kubelet[2614]: E0213 08:40:59.990416 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.991001 kubelet[2614]: E0213 08:40:59.990960 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.991001 kubelet[2614]: W0213 08:40:59.990996 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.991277 kubelet[2614]: E0213 08:40:59.991044 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:40:59.991569 kubelet[2614]: E0213 08:40:59.991515 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.991569 kubelet[2614]: W0213 08:40:59.991541 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.991813 kubelet[2614]: E0213 08:40:59.991577 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:40:59.991971 kubelet[2614]: E0213 08:40:59.991940 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:40:59.991971 kubelet[2614]: W0213 08:40:59.991968 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:40:59.992194 kubelet[2614]: E0213 08:40:59.992002 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:00.056699 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4023797946.mount: Deactivated successfully. Feb 13 08:41:00.470084 kubelet[2614]: E0213 08:41:00.469882 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:01.265543 kubelet[2614]: E0213 08:41:01.265470 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:01.766017 systemd[1]: Started sshd@94-145.40.67.89:22-139.178.68.195:52816.service. Feb 13 08:41:01.765000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.67.89:22-139.178.68.195:52816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:01.793474 kernel: kauditd_printk_skb: 64 callbacks suppressed Feb 13 08:41:01.793653 kernel: audit: type=1130 audit(1707813661.765:1936): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.67.89:22-139.178.68.195:52816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:01.913296 sshd[5972]: Accepted publickey for core from 139.178.68.195 port 52816 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:01.912000 audit[5972]: USER_ACCT pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:01.914746 sshd[5972]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:01.917215 systemd-logind[1463]: New session 75 of user core. Feb 13 08:41:01.917744 systemd[1]: Started session-75.scope. Feb 13 08:41:01.995959 sshd[5972]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:01.997426 systemd[1]: sshd@94-145.40.67.89:22-139.178.68.195:52816.service: Deactivated successfully. Feb 13 08:41:01.997855 systemd[1]: session-75.scope: Deactivated successfully. Feb 13 08:41:01.998280 systemd-logind[1463]: Session 75 logged out. Waiting for processes to exit. Feb 13 08:41:01.998744 systemd-logind[1463]: Removed session 75. 
Feb 13 08:41:01.913000 audit[5972]: CRED_ACQ pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:02.096639 kernel: audit: type=1101 audit(1707813661.912:1937): pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:02.096728 kernel: audit: type=1103 audit(1707813661.913:1938): pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:02.096767 kernel: audit: type=1006 audit(1707813661.913:1939): pid=5972 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=75 res=1 Feb 13 08:41:01.913000 audit[5972]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffebaa46640 a2=3 a3=0 items=0 ppid=1 pid=5972 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:02.248372 kernel: audit: type=1300 audit(1707813661.913:1939): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffebaa46640 a2=3 a3=0 items=0 ppid=1 pid=5972 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:02.248432 kernel: audit: type=1327 audit(1707813661.913:1939): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:01.913000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:41:01.919000 audit[5972]: USER_START pid=5972 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:02.375066 kernel: audit: type=1105 audit(1707813661.919:1940): pid=5972 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:02.375106 kernel: audit: type=1103 audit(1707813661.919:1941): pid=5974 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:01.919000 audit[5974]: CRED_ACQ pid=5974 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:01.995000 audit[5972]: USER_END pid=5972 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:02.560507 kernel: audit: type=1106 audit(1707813661.995:1942): pid=5972 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:41:02.560609 kernel: audit: type=1104 audit(1707813661.995:1943): pid=5972 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:01.995000 audit[5972]: CRED_DISP pid=5972 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:01.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-145.40.67.89:22-139.178.68.195:52816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:03.264699 kubelet[2614]: E0213 08:41:03.264677 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:05.265315 kubelet[2614]: E0213 08:41:05.265215 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:05.471735 kubelet[2614]: E0213 08:41:05.471668 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:06.103801 systemd[1]: Started sshd@95-145.40.67.89:22-43.153.15.221:60812.service. 
Feb 13 08:41:06.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.67.89:22-43.153.15.221:60812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:06.260804 sshd[5998]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:41:06.260000 audit[5998]: USER_AUTH pid=5998 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:41:07.007090 systemd[1]: Started sshd@96-145.40.67.89:22-139.178.68.195:36950.service. Feb 13 08:41:07.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.67.89:22-139.178.68.195:36950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:07.048386 kernel: kauditd_printk_skb: 3 callbacks suppressed Feb 13 08:41:07.048477 kernel: audit: type=1130 audit(1707813667.006:1947): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.67.89:22-139.178.68.195:36950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:07.165000 audit[6001]: USER_ACCT pid=6001 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.166659 sshd[6001]: Accepted publickey for core from 139.178.68.195 port 36950 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:07.168823 sshd[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:07.174445 systemd-logind[1463]: New session 76 of user core. Feb 13 08:41:07.175899 systemd[1]: Started session-76.scope. Feb 13 08:41:07.260172 sshd[6001]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:07.167000 audit[6001]: CRED_ACQ pid=6001 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.260938 kernel: audit: type=1101 audit(1707813667.165:1948): pid=6001 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.260970 kernel: audit: type=1103 audit(1707813667.167:1949): pid=6001 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.261761 systemd[1]: sshd@96-145.40.67.89:22-139.178.68.195:36950.service: Deactivated successfully. Feb 13 08:41:07.262286 systemd[1]: session-76.scope: Deactivated successfully. 
Feb 13 08:41:07.262590 systemd-logind[1463]: Session 76 logged out. Waiting for processes to exit. Feb 13 08:41:07.263009 systemd-logind[1463]: Removed session 76. Feb 13 08:41:07.410737 kernel: audit: type=1006 audit(1707813667.167:1950): pid=6001 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=76 res=1 Feb 13 08:41:07.410816 kernel: audit: type=1300 audit(1707813667.167:1950): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff39c7f2d0 a2=3 a3=0 items=0 ppid=1 pid=6001 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:07.167000 audit[6001]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff39c7f2d0 a2=3 a3=0 items=0 ppid=1 pid=6001 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:07.411171 kubelet[2614]: E0213 08:41:07.411121 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:07.503309 kernel: audit: type=1327 audit(1707813667.167:1950): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:07.167000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:07.534016 kernel: audit: type=1105 audit(1707813667.181:1951): pid=6001 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.181000 audit[6001]: 
USER_START pid=6001 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.629098 kernel: audit: type=1103 audit(1707813667.183:1952): pid=6003 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.183000 audit[6003]: CRED_ACQ pid=6003 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.718382 kernel: audit: type=1106 audit(1707813667.259:1953): pid=6001 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.259000 audit[6001]: USER_END pid=6001 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.259000 audit[6001]: CRED_DISP pid=6001 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.903340 kernel: audit: type=1104 audit(1707813667.259:1954): pid=6001 uid=0 auid=500 ses=76 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:07.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-145.40.67.89:22-139.178.68.195:36950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:08.839425 sshd[5998]: Failed password for root from 43.153.15.221 port 60812 ssh2 Feb 13 08:41:09.119702 sshd[5998]: Received disconnect from 43.153.15.221 port 60812:11: Bye Bye [preauth] Feb 13 08:41:09.119702 sshd[5998]: Disconnected from authenticating user root 43.153.15.221 port 60812 [preauth] Feb 13 08:41:09.122176 systemd[1]: sshd@95-145.40.67.89:22-43.153.15.221:60812.service: Deactivated successfully. Feb 13 08:41:09.121000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-145.40.67.89:22-43.153.15.221:60812 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:09.265430 kubelet[2614]: E0213 08:41:09.265322 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:10.472847 kubelet[2614]: E0213 08:41:10.472733 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:11.265483 kubelet[2614]: E0213 08:41:11.265378 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:12.262778 systemd[1]: Started sshd@97-145.40.67.89:22-139.178.68.195:36952.service. Feb 13 08:41:12.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.67.89:22-139.178.68.195:36952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:12.289735 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:41:12.289804 kernel: audit: type=1130 audit(1707813672.261:1957): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.67.89:22-139.178.68.195:36952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:12.406000 audit[6027]: USER_ACCT pid=6027 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.408208 sshd[6027]: Accepted publickey for core from 139.178.68.195 port 36952 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:12.411340 sshd[6027]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:12.416617 systemd-logind[1463]: New session 77 of user core. Feb 13 08:41:12.417146 systemd[1]: Started session-77.scope. Feb 13 08:41:12.498050 sshd[6027]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:12.499395 systemd[1]: sshd@97-145.40.67.89:22-139.178.68.195:36952.service: Deactivated successfully. Feb 13 08:41:12.499809 systemd[1]: session-77.scope: Deactivated successfully. Feb 13 08:41:12.500175 systemd-logind[1463]: Session 77 logged out. Waiting for processes to exit. Feb 13 08:41:12.500623 systemd-logind[1463]: Removed session 77. 
Feb 13 08:41:12.409000 audit[6027]: CRED_ACQ pid=6027 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.591643 kernel: audit: type=1101 audit(1707813672.406:1958): pid=6027 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.591682 kernel: audit: type=1103 audit(1707813672.409:1959): pid=6027 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.591699 kernel: audit: type=1006 audit(1707813672.409:1960): pid=6027 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=77 res=1 Feb 13 08:41:12.650239 kernel: audit: type=1300 audit(1707813672.409:1960): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee6f48b60 a2=3 a3=0 items=0 ppid=1 pid=6027 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:12.409000 audit[6027]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffee6f48b60 a2=3 a3=0 items=0 ppid=1 pid=6027 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:12.409000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:12.772839 kernel: audit: type=1327 audit(1707813672.409:1960): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:41:12.772869 kernel: audit: type=1105 audit(1707813672.418:1961): pid=6027 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.418000 audit[6027]: USER_START pid=6027 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.867349 kernel: audit: type=1103 audit(1707813672.419:1962): pid=6029 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.419000 audit[6029]: CRED_ACQ pid=6029 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.878362 systemd[1]: Started sshd@98-145.40.67.89:22-161.35.108.241:49506.service. 
Feb 13 08:41:12.956642 kernel: audit: type=1106 audit(1707813672.497:1963): pid=6027 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.497000 audit[6027]: USER_END pid=6027 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:13.052116 kernel: audit: type=1104 audit(1707813672.497:1964): pid=6027 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.497000 audit[6027]: CRED_DISP pid=6027 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:12.498000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-145.40.67.89:22-139.178.68.195:36952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:12.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.67.89:22-161.35.108.241:49506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:13.264900 kubelet[2614]: E0213 08:41:13.264850 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:13.354307 kubelet[2614]: E0213 08:41:13.354220 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.354857 kubelet[2614]: W0213 08:41:13.354774 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.355339 kubelet[2614]: E0213 08:41:13.355285 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.357182 kubelet[2614]: E0213 08:41:13.357096 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.357182 kubelet[2614]: W0213 08:41:13.357136 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.357182 kubelet[2614]: E0213 08:41:13.357180 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.357787 kubelet[2614]: E0213 08:41:13.357704 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.357787 kubelet[2614]: W0213 08:41:13.357740 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.357787 kubelet[2614]: E0213 08:41:13.357779 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.358462 kubelet[2614]: E0213 08:41:13.358385 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.358462 kubelet[2614]: W0213 08:41:13.358421 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.358462 kubelet[2614]: E0213 08:41:13.358465 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.359034 kubelet[2614]: E0213 08:41:13.358956 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.359034 kubelet[2614]: W0213 08:41:13.358983 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.359034 kubelet[2614]: E0213 08:41:13.359018 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.359495 kubelet[2614]: E0213 08:41:13.359461 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.359611 kubelet[2614]: W0213 08:41:13.359494 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.359611 kubelet[2614]: E0213 08:41:13.359534 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.360141 kubelet[2614]: E0213 08:41:13.360068 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.360141 kubelet[2614]: W0213 08:41:13.360096 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.360141 kubelet[2614]: E0213 08:41:13.360131 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.360665 kubelet[2614]: E0213 08:41:13.360627 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.360665 kubelet[2614]: W0213 08:41:13.360661 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.360991 kubelet[2614]: E0213 08:41:13.360701 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.361232 kubelet[2614]: E0213 08:41:13.361190 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.361232 kubelet[2614]: W0213 08:41:13.361225 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.361547 kubelet[2614]: E0213 08:41:13.361265 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.361827 kubelet[2614]: E0213 08:41:13.361772 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.361827 kubelet[2614]: W0213 08:41:13.361799 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.362084 kubelet[2614]: E0213 08:41:13.361835 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.362431 kubelet[2614]: E0213 08:41:13.362350 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.362431 kubelet[2614]: W0213 08:41:13.362384 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.362431 kubelet[2614]: E0213 08:41:13.362423 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.362918 kubelet[2614]: E0213 08:41:13.362888 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.362918 kubelet[2614]: W0213 08:41:13.362916 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.363163 kubelet[2614]: E0213 08:41:13.362965 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.363567 kubelet[2614]: E0213 08:41:13.363484 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.363567 kubelet[2614]: W0213 08:41:13.363518 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.363567 kubelet[2614]: E0213 08:41:13.363557 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.364048 kubelet[2614]: E0213 08:41:13.363945 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.364048 kubelet[2614]: W0213 08:41:13.363972 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.364048 kubelet[2614]: E0213 08:41:13.364005 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.364603 kubelet[2614]: E0213 08:41:13.364531 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.364603 kubelet[2614]: W0213 08:41:13.364565 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.364603 kubelet[2614]: E0213 08:41:13.364604 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.365140 kubelet[2614]: E0213 08:41:13.365095 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.365140 kubelet[2614]: W0213 08:41:13.365123 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.365140 kubelet[2614]: E0213 08:41:13.365156 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.365581 kubelet[2614]: E0213 08:41:13.365558 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.365698 kubelet[2614]: W0213 08:41:13.365582 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.365698 kubelet[2614]: E0213 08:41:13.365616 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.365983 kubelet[2614]: E0213 08:41:13.365964 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.366108 kubelet[2614]: W0213 08:41:13.365987 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.366108 kubelet[2614]: E0213 08:41:13.366016 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 08:41:13.366510 kubelet[2614]: E0213 08:41:13.366438 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.366510 kubelet[2614]: W0213 08:41:13.366472 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.366510 kubelet[2614]: E0213 08:41:13.366511 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.366918 kubelet[2614]: E0213 08:41:13.366892 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 08:41:13.367057 kubelet[2614]: W0213 08:41:13.366917 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 08:41:13.367057 kubelet[2614]: E0213 08:41:13.366971 2614 plugins.go:736] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 08:41:13.378467 sshd[6052]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:41:13.377000 audit[6052]: USER_AUTH pid=6052 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:41:15.264210 kubelet[2614]: E0213 08:41:15.264160 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:15.474554 kubelet[2614]: E0213 08:41:15.474451 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:15.585647 sshd[6052]: Failed password for root from 161.35.108.241 port 49506 ssh2 Feb 13 08:41:16.300800 sshd[6052]: Received disconnect from 161.35.108.241 port 49506:11: Bye Bye [preauth] Feb 13 08:41:16.300800 sshd[6052]: Disconnected from authenticating user root 161.35.108.241 port 49506 [preauth] Feb 13 08:41:16.303491 systemd[1]: sshd@98-145.40.67.89:22-161.35.108.241:49506.service: Deactivated successfully. Feb 13 08:41:16.303000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-145.40.67.89:22-161.35.108.241:49506 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:16.494696 env[1475]: time="2024-02-13T08:41:16.494644337Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:16.495354 env[1475]: time="2024-02-13T08:41:16.495306477Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:16.496471 env[1475]: time="2024-02-13T08:41:16.496428489Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:16.497810 env[1475]: time="2024-02-13T08:41:16.497768950Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b05edbd1f80db4ada229e6001a666a7dd36bb6ab617143684fb3d28abfc4b71e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:16.498231 env[1475]: time="2024-02-13T08:41:16.498188731Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.27.0\" returns image reference \"sha256:6506d2e0be2d5ec9cb8dbe00c4b4f037c67b6ab4ec14a1f0c83333ac51f4da9a\"" Feb 13 08:41:16.499508 env[1475]: time="2024-02-13T08:41:16.499489330Z" level=info msg="CreateContainer within sandbox \"ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 08:41:16.524059 env[1475]: time="2024-02-13T08:41:16.524013587Z" level=info msg="CreateContainer within sandbox \"ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1\"" Feb 13 08:41:16.524358 
env[1475]: time="2024-02-13T08:41:16.524330780Z" level=info msg="StartContainer for \"03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1\"" Feb 13 08:41:16.533963 systemd[1]: Started cri-containerd-03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1.scope. Feb 13 08:41:16.539000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.539000 audit[6082]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001976b0 a2=3c a3=7fc5381ce3b8 items=0 ppid=3276 pid=6082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:16.539000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623761343662326562303039623534366437333232663462346533 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit: BPF prog-id=139 op=LOAD Feb 13 08:41:16.540000 audit[6082]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001979d8 a2=78 a3=c00029ce08 items=0 ppid=3276 pid=6082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:16.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623761343662326562303039623534366437333232663462346533 Feb 13 
08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit: BPF prog-id=140 op=LOAD Feb 13 08:41:16.540000 audit[6082]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000197770 a2=78 a3=c00029ce58 items=0 ppid=3276 pid=6082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:16.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623761343662326562303039623534366437333232663462346533 Feb 13 08:41:16.540000 audit: BPF prog-id=140 op=UNLOAD Feb 13 08:41:16.540000 audit: BPF prog-id=139 op=UNLOAD Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 
08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { perfmon } for pid=6082 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit[6082]: AVC avc: denied { bpf } for pid=6082 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:16.540000 audit: BPF prog-id=141 op=LOAD Feb 13 08:41:16.540000 audit[6082]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000197c30 a2=78 a3=c00029cee8 items=0 ppid=3276 pid=6082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:16.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033623761343662326562303039623534366437333232663462346533 Feb 13 08:41:16.547072 env[1475]: time="2024-02-13T08:41:16.547042157Z" level=info msg="StartContainer for \"03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1\" returns successfully" Feb 13 08:41:16.553728 systemd[1]: cri-containerd-03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1.scope: Deactivated successfully. 
Feb 13 08:41:16.567077 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1-rootfs.mount: Deactivated successfully. Feb 13 08:41:16.572000 audit: BPF prog-id=141 op=UNLOAD Feb 13 08:41:16.768196 env[1475]: time="2024-02-13T08:41:16.768058859Z" level=info msg="shim disconnected" id=03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1 Feb 13 08:41:16.768618 env[1475]: time="2024-02-13T08:41:16.768201405Z" level=warning msg="cleaning up after shim disconnected" id=03b7a46b2eb009b546d7322f4b4e334c3233871e7e7ee512db9ed00c0741e9e1 namespace=k8s.io Feb 13 08:41:16.768618 env[1475]: time="2024-02-13T08:41:16.768238779Z" level=info msg="cleaning up dead shim" Feb 13 08:41:16.776267 env[1475]: time="2024-02-13T08:41:16.776224610Z" level=warning msg="cleanup warnings time=\"2024-02-13T08:41:16Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=6123 runtime=io.containerd.runc.v2\n" Feb 13 08:41:16.930583 env[1475]: time="2024-02-13T08:41:16.930493080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\"" Feb 13 08:41:17.264893 kubelet[2614]: E0213 08:41:17.264816 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:17.507875 systemd[1]: Started sshd@99-145.40.67.89:22-139.178.68.195:44134.service. Feb 13 08:41:17.507000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.67.89:22-139.178.68.195:44134 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:17.534920 kernel: kauditd_printk_skb: 48 callbacks suppressed Feb 13 08:41:17.534998 kernel: audit: type=1130 audit(1707813677.507:1976): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.67.89:22-139.178.68.195:44134 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:17.651000 audit[6142]: USER_ACCT pid=6142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.652962 sshd[6142]: Accepted publickey for core from 139.178.68.195 port 44134 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:17.654223 sshd[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:17.656521 systemd-logind[1463]: New session 78 of user core. Feb 13 08:41:17.657026 systemd[1]: Started session-78.scope. Feb 13 08:41:17.735024 sshd[6142]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:17.736477 systemd[1]: sshd@99-145.40.67.89:22-139.178.68.195:44134.service: Deactivated successfully. Feb 13 08:41:17.736895 systemd[1]: session-78.scope: Deactivated successfully. Feb 13 08:41:17.737247 systemd-logind[1463]: Session 78 logged out. Waiting for processes to exit. Feb 13 08:41:17.737708 systemd-logind[1463]: Removed session 78. 
Feb 13 08:41:17.653000 audit[6142]: CRED_ACQ pid=6142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.834847 kernel: audit: type=1101 audit(1707813677.651:1977): pid=6142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.834884 kernel: audit: type=1103 audit(1707813677.653:1978): pid=6142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.834900 kernel: audit: type=1006 audit(1707813677.653:1979): pid=6142 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=78 res=1 Feb 13 08:41:17.893447 kernel: audit: type=1300 audit(1707813677.653:1979): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffea62a8580 a2=3 a3=0 items=0 ppid=1 pid=6142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:17.653000 audit[6142]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffea62a8580 a2=3 a3=0 items=0 ppid=1 pid=6142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:17.653000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:18.016012 kernel: audit: type=1327 audit(1707813677.653:1979): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:41:18.016043 kernel: audit: type=1105 audit(1707813677.658:1980): pid=6142 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.658000 audit[6142]: USER_START pid=6142 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:18.110615 kernel: audit: type=1103 audit(1707813677.658:1981): pid=6144 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.658000 audit[6144]: CRED_ACQ pid=6144 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:18.200104 kernel: audit: type=1106 audit(1707813677.734:1982): pid=6142 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.734000 audit[6142]: USER_END pid=6142 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:41:18.295690 kernel: audit: type=1104 audit(1707813677.734:1983): pid=6142 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.734000 audit[6142]: CRED_DISP pid=6142 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:17.735000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-145.40.67.89:22-139.178.68.195:44134 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:19.265194 kubelet[2614]: E0213 08:41:19.265148 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:20.441690 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3078170191.mount: Deactivated successfully. 
Feb 13 08:41:20.476155 kubelet[2614]: E0213 08:41:20.475995 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:21.264596 kubelet[2614]: E0213 08:41:21.264491 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:22.743928 systemd[1]: Started sshd@100-145.40.67.89:22-139.178.68.195:44138.service. Feb 13 08:41:22.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.67.89:22-139.178.68.195:44138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:22.770908 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:41:22.770962 kernel: audit: type=1130 audit(1707813682.743:1985): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.67.89:22-139.178.68.195:44138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:22.887000 audit[6166]: USER_ACCT pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:22.888397 sshd[6166]: Accepted publickey for core from 139.178.68.195 port 44138 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:22.889247 sshd[6166]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:22.891569 systemd-logind[1463]: New session 79 of user core. Feb 13 08:41:22.892029 systemd[1]: Started session-79.scope. Feb 13 08:41:22.972047 sshd[6166]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:22.973566 systemd[1]: sshd@100-145.40.67.89:22-139.178.68.195:44138.service: Deactivated successfully. Feb 13 08:41:22.974124 systemd[1]: session-79.scope: Deactivated successfully. Feb 13 08:41:22.974542 systemd-logind[1463]: Session 79 logged out. Waiting for processes to exit. Feb 13 08:41:22.975010 systemd-logind[1463]: Removed session 79. 
Feb 13 08:41:22.888000 audit[6166]: CRED_ACQ pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:23.072355 kernel: audit: type=1101 audit(1707813682.887:1986): pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:23.072394 kernel: audit: type=1103 audit(1707813682.888:1987): pid=6166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:23.072411 kernel: audit: type=1006 audit(1707813682.888:1988): pid=6166 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=79 res=1 Feb 13 08:41:23.131145 kernel: audit: type=1300 audit(1707813682.888:1988): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc9ce9e40 a2=3 a3=0 items=0 ppid=1 pid=6166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:22.888000 audit[6166]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc9ce9e40 a2=3 a3=0 items=0 ppid=1 pid=6166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:23.223267 kernel: audit: type=1327 audit(1707813682.888:1988): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:22.888000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:41:23.253806 kernel: audit: type=1105 audit(1707813682.893:1989): pid=6166 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:22.893000 audit[6166]: USER_START pid=6166 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:23.264216 kubelet[2614]: E0213 08:41:23.264178 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:23.348442 kernel: audit: type=1103 audit(1707813682.894:1990): pid=6168 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:22.894000 audit[6168]: CRED_ACQ pid=6168 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:23.437764 kernel: audit: type=1106 audit(1707813682.971:1991): pid=6166 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 
addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:22.971000 audit[6166]: USER_END pid=6166 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:23.533392 kernel: audit: type=1104 audit(1707813682.971:1992): pid=6166 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:22.971000 audit[6166]: CRED_DISP pid=6166 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:22.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-145.40.67.89:22-139.178.68.195:44138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:25.264325 kubelet[2614]: E0213 08:41:25.264275 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:25.477468 kubelet[2614]: E0213 08:41:25.477378 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:27.265362 kubelet[2614]: E0213 08:41:27.265279 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:27.982052 systemd[1]: Started sshd@101-145.40.67.89:22-139.178.68.195:33694.service. Feb 13 08:41:27.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.67.89:22-139.178.68.195:33694 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:28.008993 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:41:28.009074 kernel: audit: type=1130 audit(1707813687.981:1994): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.67.89:22-139.178.68.195:33694 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:28.126000 audit[6194]: USER_ACCT pid=6194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.127720 sshd[6194]: Accepted publickey for core from 139.178.68.195 port 33694 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:28.129635 sshd[6194]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:28.132159 systemd-logind[1463]: New session 80 of user core. Feb 13 08:41:28.132572 systemd[1]: Started session-80.scope. Feb 13 08:41:28.211189 sshd[6194]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:28.212623 systemd[1]: sshd@101-145.40.67.89:22-139.178.68.195:33694.service: Deactivated successfully. Feb 13 08:41:28.213060 systemd[1]: session-80.scope: Deactivated successfully. Feb 13 08:41:28.213505 systemd-logind[1463]: Session 80 logged out. Waiting for processes to exit. Feb 13 08:41:28.213931 systemd-logind[1463]: Removed session 80. 
Feb 13 08:41:28.128000 audit[6194]: CRED_ACQ pid=6194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.310009 kernel: audit: type=1101 audit(1707813688.126:1995): pid=6194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.310050 kernel: audit: type=1103 audit(1707813688.128:1996): pid=6194 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.310069 kernel: audit: type=1006 audit(1707813688.128:1997): pid=6194 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=80 res=1 Feb 13 08:41:28.368614 kernel: audit: type=1300 audit(1707813688.128:1997): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff587833a0 a2=3 a3=0 items=0 ppid=1 pid=6194 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:28.128000 audit[6194]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff587833a0 a2=3 a3=0 items=0 ppid=1 pid=6194 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:28.460728 kernel: audit: type=1327 audit(1707813688.128:1997): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:28.128000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:41:28.491296 kernel: audit: type=1105 audit(1707813688.133:1998): pid=6194 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.133000 audit[6194]: USER_START pid=6194 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.134000 audit[6196]: CRED_ACQ pid=6196 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.675238 kernel: audit: type=1103 audit(1707813688.134:1999): pid=6196 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.675274 kernel: audit: type=1106 audit(1707813688.210:2000): pid=6194 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.210000 audit[6194]: USER_END pid=6194 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:41:28.770841 kernel: audit: type=1104 audit(1707813688.211:2001): pid=6194 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.211000 audit[6194]: CRED_DISP pid=6194 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:28.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-145.40.67.89:22-139.178.68.195:33694 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:29.265262 kubelet[2614]: E0213 08:41:29.265169 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:30.480962 kubelet[2614]: E0213 08:41:30.480844 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:31.265275 kubelet[2614]: E0213 08:41:31.265202 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:33.222481 systemd[1]: Started sshd@102-145.40.67.89:22-139.178.68.195:33698.service. 
Feb 13 08:41:33.221000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.67.89:22-139.178.68.195:33698 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:33.258831 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:41:33.258912 kernel: audit: type=1130 audit(1707813693.221:2003): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.67.89:22-139.178.68.195:33698 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:33.265186 kubelet[2614]: E0213 08:41:33.265125 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:33.375000 audit[6219]: USER_ACCT pid=6219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.376413 sshd[6219]: Accepted publickey for core from 139.178.68.195 port 33698 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:33.377210 sshd[6219]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:33.379643 systemd-logind[1463]: New session 81 of user core. Feb 13 08:41:33.380087 systemd[1]: Started session-81.scope. Feb 13 08:41:33.457733 sshd[6219]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:33.459236 systemd[1]: sshd@102-145.40.67.89:22-139.178.68.195:33698.service: Deactivated successfully. 
Feb 13 08:41:33.459650 systemd[1]: session-81.scope: Deactivated successfully. Feb 13 08:41:33.459933 systemd-logind[1463]: Session 81 logged out. Waiting for processes to exit. Feb 13 08:41:33.460458 systemd-logind[1463]: Removed session 81. Feb 13 08:41:33.376000 audit[6219]: CRED_ACQ pid=6219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.558446 kernel: audit: type=1101 audit(1707813693.375:2004): pid=6219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.558546 kernel: audit: type=1103 audit(1707813693.376:2005): pid=6219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.558565 kernel: audit: type=1006 audit(1707813693.376:2006): pid=6219 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=81 res=1 Feb 13 08:41:33.617082 kernel: audit: type=1300 audit(1707813693.376:2006): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc5264d910 a2=3 a3=0 items=0 ppid=1 pid=6219 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:33.376000 audit[6219]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc5264d910 a2=3 a3=0 items=0 ppid=1 pid=6219 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd" exe="/usr/sbin/sshd" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:33.709102 kernel: audit: type=1327 audit(1707813693.376:2006): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:33.376000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:33.739655 kernel: audit: type=1105 audit(1707813693.381:2007): pid=6219 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.381000 audit[6219]: USER_START pid=6219 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.834160 kernel: audit: type=1103 audit(1707813693.381:2008): pid=6221 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.381000 audit[6221]: CRED_ACQ pid=6221 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.923426 kernel: audit: type=1106 audit(1707813693.457:2009): pid=6219 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.457000 audit[6219]: USER_END pid=6219 uid=0 auid=500 ses=81 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:34.018988 kernel: audit: type=1104 audit(1707813693.457:2010): pid=6219 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.457000 audit[6219]: CRED_DISP pid=6219 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:33.458000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-145.40.67.89:22-139.178.68.195:33698 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:35.265329 kubelet[2614]: E0213 08:41:35.265227 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:35.482469 kubelet[2614]: E0213 08:41:35.482421 2614 kubelet.go:2475] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" Feb 13 08:41:35.611000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.611000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002355860 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:41:35.611000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:41:35.611000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.611000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00262bf50 a2=fc6 a3=0 
items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:41:35.611000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:41:35.895000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.895000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c001c85aa0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:41:35.895000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:41:35.895000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.895000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c014404750 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:41:35.895000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:41:35.895000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.895000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.895000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0106ab860 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:41:35.895000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:41:35.895000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0144047b0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:41:35.895000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:41:35.895000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.895000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=6a a1=c001c85ae0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:41:35.895000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:41:35.895000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:41:35.895000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c011b9a780 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 
key=(null) Feb 13 08:41:35.895000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:41:37.265240 kubelet[2614]: E0213 08:41:37.265193 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:38.149844 env[1475]: time="2024-02-13T08:41:38.149788879Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:38.150500 env[1475]: time="2024-02-13T08:41:38.150433341Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:38.151498 env[1475]: time="2024-02-13T08:41:38.151451494Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.27.0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:38.152871 env[1475]: time="2024-02-13T08:41:38.152836941Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:d943b4c23e82a39b0186a1a3b2fe8f728e543d503df72d7be521501a82b7e7b4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Feb 13 08:41:38.153277 env[1475]: time="2024-02-13T08:41:38.153217345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.27.0\" returns image reference 
\"sha256:8e8d96a874c0e2f137bc6e0ff4b9da4ac2341852e41d99ab81983d329bb87d93\"" Feb 13 08:41:38.165158 env[1475]: time="2024-02-13T08:41:38.165142002Z" level=info msg="CreateContainer within sandbox \"ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 08:41:38.170854 env[1475]: time="2024-02-13T08:41:38.170829118Z" level=info msg="CreateContainer within sandbox \"ae0f01335ff86a4e7c633b5ba8e67b73bb403c5c3724c4060a40bcade37efaa2\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31\"" Feb 13 08:41:38.171097 env[1475]: time="2024-02-13T08:41:38.171084302Z" level=info msg="StartContainer for \"5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31\"" Feb 13 08:41:38.181159 systemd[1]: Started cri-containerd-5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31.scope. Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=0 a1=c0001496b0 a2=3c a3=7f9e106ab4d8 items=0 ppid=3276 pid=6251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:38.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534333465353539366232613035663537616131376562643037626663 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 
audit: BPF prog-id=142 op=LOAD Feb 13 08:41:38.187000 audit[6251]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c0001499d8 a2=78 a3=c000218a08 items=0 ppid=3276 pid=6251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:38.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534333465353539366232613035663537616131376562643037626663 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC 
avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit: BPF prog-id=143 op=LOAD Feb 13 08:41:38.187000 audit[6251]: SYSCALL arch=c000003e syscall=321 success=yes exit=17 a0=5 a1=c000149770 a2=78 a3=c000218a58 items=0 ppid=3276 pid=6251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:38.187000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534333465353539366232613035663537616131376562643037626663 Feb 13 08:41:38.187000 audit: BPF prog-id=143 op=UNLOAD Feb 13 08:41:38.187000 audit: BPF prog-id=142 op=UNLOAD Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { perfmon } for pid=6251 comm="runc" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit[6251]: AVC avc: denied { bpf } for pid=6251 comm="runc" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Feb 13 08:41:38.187000 audit: BPF prog-id=144 op=LOAD Feb 13 08:41:38.187000 audit[6251]: SYSCALL arch=c000003e syscall=321 success=yes exit=15 a0=5 a1=c000149c30 a2=78 a3=c000218ae8 items=0 ppid=3276 pid=6251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/run/torcx/unpack/docker/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:38.187000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534333465353539366232613035663537616131376562643037626663 Feb 13 08:41:38.194444 env[1475]: time="2024-02-13T08:41:38.194393289Z" level=info msg="StartContainer for \"5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31\" returns successfully" Feb 13 08:41:38.460946 systemd[1]: Started sshd@103-145.40.67.89:22-139.178.68.195:50066.service. Feb 13 08:41:38.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.67.89:22-139.178.68.195:50066 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:38.487672 kernel: kauditd_printk_skb: 68 callbacks suppressed Feb 13 08:41:38.487747 kernel: audit: type=1130 audit(1707813698.460:2026): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.67.89:22-139.178.68.195:50066 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:38.604000 audit[6288]: USER_ACCT pid=6288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:38.605983 sshd[6288]: Accepted publickey for core from 139.178.68.195 port 50066 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:38.607223 sshd[6288]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:38.609975 systemd-logind[1463]: New session 82 of user core. Feb 13 08:41:38.611091 systemd[1]: Started session-82.scope. 
Feb 13 08:41:38.695209 sshd[6288]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:38.697055 systemd[1]: sshd@103-145.40.67.89:22-139.178.68.195:50066.service: Deactivated successfully. Feb 13 08:41:38.606000 audit[6288]: CRED_ACQ pid=6288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:38.697410 systemd[1]: session-82.scope: Deactivated successfully. Feb 13 08:41:38.697727 systemd-logind[1463]: Session 82 logged out. Waiting for processes to exit. Feb 13 08:41:38.698327 systemd[1]: Started sshd@104-145.40.67.89:22-139.178.68.195:50074.service. Feb 13 08:41:38.698688 systemd-logind[1463]: Removed session 82. Feb 13 08:41:38.787823 kernel: audit: type=1101 audit(1707813698.604:2027): pid=6288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:38.796398 kernel: audit: type=1103 audit(1707813698.606:2028): pid=6288 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:38.796420 kernel: audit: type=1006 audit(1707813698.606:2029): pid=6288 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=82 res=1 Feb 13 08:41:38.606000 audit[6288]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6b249cc0 a2=3 a3=0 items=0 ppid=1 pid=6288 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 
08:41:38.846871 systemd[1]: cri-containerd-5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31.scope: Deactivated successfully.
Feb 13 08:41:38.856132 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31-rootfs.mount: Deactivated successfully.
Feb 13 08:41:38.875096 sshd[6313]: Accepted publickey for core from 139.178.68.195 port 50074 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:41:38.878411 sshd[6313]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:41:38.880890 systemd-logind[1463]: New session 83 of user core.
Feb 13 08:41:38.881393 systemd[1]: Started session-83.scope.
Feb 13 08:41:38.938700 kernel: audit: type=1300 audit(1707813698.606:2029): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6b249cc0 a2=3 a3=0 items=0 ppid=1 pid=6288 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:38.938784 kernel: audit: type=1327 audit(1707813698.606:2029): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:41:38.606000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:41:38.969311 kernel: audit: type=1105 audit(1707813698.612:2030): pid=6288 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.612000 audit[6288]: USER_START pid=6288 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.063891 kernel: audit: type=1103 audit(1707813698.613:2031): pid=6290 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.613000 audit[6290]: CRED_ACQ pid=6290 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.153218 kernel: audit: type=1106 audit(1707813698.694:2032): pid=6288 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.694000 audit[6288]: USER_END pid=6288 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.248821 kernel: audit: type=1104 audit(1707813698.694:2033): pid=6288 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.694000 audit[6288]: CRED_DISP pid=6288 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.264651 kubelet[2614]: E0213 08:41:39.264610 2614 pod_workers.go:965] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:41:38.696000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-145.40.67.89:22-139.178.68.195:50066 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:38.697000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.67.89:22-139.178.68.195:50074 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:38.874000 audit[6313]: USER_ACCT pid=6313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.877000 audit[6313]: CRED_ACQ pid=6313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.877000 audit[6313]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb40255a0 a2=3 a3=0 items=0 ppid=1 pid=6313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:38.877000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:41:38.882000 audit[6313]: USER_START pid=6313 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:38.883000 audit[6340]: CRED_ACQ pid=6340 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.153000 audit: BPF prog-id=144 op=UNLOAD
Feb 13 08:41:39.425390 env[1475]: time="2024-02-13T08:41:39.425320355Z" level=info msg="shim disconnected" id=5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31
Feb 13 08:41:39.426005 env[1475]: time="2024-02-13T08:41:39.425392806Z" level=warning msg="cleaning up after shim disconnected" id=5434e5596b2a05f57aa17ebd07bfca88bc48551207fa594219c406da56a61e31 namespace=k8s.io
Feb 13 08:41:39.426005 env[1475]: time="2024-02-13T08:41:39.425415255Z" level=info msg="cleaning up dead shim"
Feb 13 08:41:39.437229 env[1475]: time="2024-02-13T08:41:39.437184558Z" level=warning msg="cleanup warnings time=\"2024-02-13T08:41:39Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=6360 runtime=io.containerd.runc.v2\n"
Feb 13 08:41:39.973610 sshd[6313]: pam_unix(sshd:session): session closed for user core
Feb 13 08:41:39.974000 audit[6313]: USER_END pid=6313 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.975000 audit[6313]: CRED_DISP pid=6313 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:39.979972 systemd[1]: sshd@104-145.40.67.89:22-139.178.68.195:50074.service: Deactivated successfully.
Feb 13 08:41:39.979000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-145.40.67.89:22-139.178.68.195:50074 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:39.981537 systemd[1]: session-83.scope: Deactivated successfully.
Feb 13 08:41:39.983164 systemd-logind[1463]: Session 83 logged out. Waiting for processes to exit.
Feb 13 08:41:39.986310 systemd[1]: Started sshd@105-145.40.67.89:22-139.178.68.195:50084.service.
Feb 13 08:41:39.985000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.67.89:22-139.178.68.195:50084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:39.988776 systemd-logind[1463]: Removed session 83.
Feb 13 08:41:39.990921 env[1475]: time="2024-02-13T08:41:39.990903593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.27.0\""
Feb 13 08:41:40.024251 sshd[6374]: Accepted publickey for core from 139.178.68.195 port 50084 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:41:40.023000 audit[6374]: USER_ACCT pid=6374 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.024000 audit[6374]: CRED_ACQ pid=6374 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.024000 audit[6374]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc91a5f890 a2=3 a3=0 items=0 ppid=1 pid=6374 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:40.024000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:41:40.025308 sshd[6374]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:41:40.028683 systemd-logind[1463]: New session 84 of user core.
Feb 13 08:41:40.029351 systemd[1]: Started session-84.scope.
Feb 13 08:41:40.031000 audit[6374]: USER_START pid=6374 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.032000 audit[6377]: CRED_ACQ pid=6377 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.922257 sshd[6374]: pam_unix(sshd:session): session closed for user core
Feb 13 08:41:40.922000 audit[6374]: USER_END pid=6374 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.922000 audit[6374]: CRED_DISP pid=6374 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.924269 systemd[1]: sshd@105-145.40.67.89:22-139.178.68.195:50084.service: Deactivated successfully.
Feb 13 08:41:40.923000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-145.40.67.89:22-139.178.68.195:50084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:40.924671 systemd[1]: session-84.scope: Deactivated successfully.
Feb 13 08:41:40.925096 systemd-logind[1463]: Session 84 logged out. Waiting for processes to exit.
Feb 13 08:41:40.925753 systemd[1]: Started sshd@106-145.40.67.89:22-139.178.68.195:50094.service.
Feb 13 08:41:40.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.67.89:22-139.178.68.195:50094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:40.926322 systemd-logind[1463]: Removed session 84.
Feb 13 08:41:40.931000 audit[6429]: NETFILTER_CFG table=filter:111 family=2 entries=24 op=nft_register_rule pid=6429 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 08:41:40.931000 audit[6429]: SYSCALL arch=c000003e syscall=46 success=yes exit=12476 a0=3 a1=7ffe5aeeee90 a2=0 a3=7ffe5aeeee7c items=0 ppid=2919 pid=6429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:40.931000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 08:41:40.932000 audit[6429]: NETFILTER_CFG table=nat:112 family=2 entries=30 op=nft_register_rule pid=6429 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 08:41:40.932000 audit[6429]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffe5aeeee90 a2=0 a3=31030 items=0 ppid=2919 pid=6429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:40.932000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 08:41:40.952000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:41:40.952000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00216ed60 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:41:40.952000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:41:40.952000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:41:40.952000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c001127860 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:41:40.952000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:41:40.952000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:41:40.952000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c00216ed80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:41:40.952000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:41:40.952000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:41:40.952000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002399500 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:41:40.952000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:41:40.961000 audit[6422]: USER_ACCT pid=6422 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.962096 sshd[6422]: Accepted publickey for core from 139.178.68.195 port 50094 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:41:40.961000 audit[6422]: CRED_ACQ pid=6422 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.961000 audit[6422]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd14abe600 a2=3 a3=0 items=0 ppid=1 pid=6422 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:40.961000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:41:40.962981 sshd[6422]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:41:40.962000 audit[6457]: NETFILTER_CFG table=filter:113 family=2 entries=36 op=nft_register_rule pid=6457 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 08:41:40.962000 audit[6457]: SYSCALL arch=c000003e syscall=46 success=yes exit=12476 a0=3 a1=7ffdb82d6000 a2=0 a3=7ffdb82d5fec items=0 ppid=2919 pid=6457 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:40.962000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 08:41:40.965956 systemd-logind[1463]: New session 85 of user core.
Feb 13 08:41:40.966704 systemd[1]: Started session-85.scope.
Feb 13 08:41:40.968000 audit[6422]: USER_START pid=6422 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.969000 audit[6458]: CRED_ACQ pid=6458 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:40.963000 audit[6457]: NETFILTER_CFG table=nat:114 family=2 entries=30 op=nft_register_rule pid=6457 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Feb 13 08:41:40.963000 audit[6457]: SYSCALL arch=c000003e syscall=46 success=yes exit=8836 a0=3 a1=7ffdb82d6000 a2=0 a3=31030 items=0 ppid=2919 pid=6457 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:40.963000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Feb 13 08:41:41.159741 sshd[6422]: pam_unix(sshd:session): session closed for user core
Feb 13 08:41:41.159000 audit[6422]: USER_END pid=6422 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.159000 audit[6422]: CRED_DISP pid=6422 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.161484 systemd[1]: sshd@106-145.40.67.89:22-139.178.68.195:50094.service: Deactivated successfully.
Feb 13 08:41:41.160000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-145.40.67.89:22-139.178.68.195:50094 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:41.161815 systemd[1]: session-85.scope: Deactivated successfully.
Feb 13 08:41:41.162155 systemd-logind[1463]: Session 85 logged out. Waiting for processes to exit.
Feb 13 08:41:41.162763 systemd[1]: Started sshd@107-145.40.67.89:22-139.178.68.195:50100.service.
Feb 13 08:41:41.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.67.89:22-139.178.68.195:50100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:41.163160 systemd-logind[1463]: Removed session 85.
Feb 13 08:41:41.196000 audit[6480]: USER_ACCT pid=6480 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.197535 sshd[6480]: Accepted publickey for core from 139.178.68.195 port 50100 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:41:41.197000 audit[6480]: CRED_ACQ pid=6480 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.197000 audit[6480]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff36ac0050 a2=3 a3=0 items=0 ppid=1 pid=6480 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:41:41.197000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:41:41.198265 sshd[6480]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:41:41.201003 systemd-logind[1463]: New session 86 of user core.
Feb 13 08:41:41.201721 systemd[1]: Started session-86.scope.
Feb 13 08:41:41.203000 audit[6480]: USER_START pid=6480 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.204000 audit[6482]: CRED_ACQ pid=6482 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.276723 systemd[1]: Created slice kubepods-besteffort-pod767512d8_ec8b_4a84_be29_2de84e2dbb6e.slice.
Feb 13 08:41:41.281760 env[1475]: time="2024-02-13T08:41:41.281635590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8wh4k,Uid:767512d8-ec8b-4a84-be29-2de84e2dbb6e,Namespace:calico-system,Attempt:0,}"
Feb 13 08:41:41.336547 env[1475]: time="2024-02-13T08:41:41.336492133Z" level=error msg="Failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:41.336907 env[1475]: time="2024-02-13T08:41:41.336809320Z" level=error msg="encountered an error cleaning up failed sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:41.336907 env[1475]: time="2024-02-13T08:41:41.336859125Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8wh4k,Uid:767512d8-ec8b-4a84-be29-2de84e2dbb6e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:41.337125 kubelet[2614]: E0213 08:41:41.337082 2614 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:41.337321 kubelet[2614]: E0213 08:41:41.337128 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8wh4k"
Feb 13 08:41:41.337321 kubelet[2614]: E0213 08:41:41.337148 2614 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8wh4k"
Feb 13 08:41:41.337321 kubelet[2614]: E0213 08:41:41.337192 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8wh4k_calico-system(767512d8-ec8b-4a84-be29-2de84e2dbb6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8wh4k_calico-system(767512d8-ec8b-4a84-be29-2de84e2dbb6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:41:41.338154 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9-shm.mount: Deactivated successfully.
Feb 13 08:41:41.338690 sshd[6480]: pam_unix(sshd:session): session closed for user core
Feb 13 08:41:41.338000 audit[6480]: USER_END pid=6480 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.338000 audit[6480]: CRED_DISP pid=6480 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:41:41.340355 systemd[1]: sshd@107-145.40.67.89:22-139.178.68.195:50100.service: Deactivated successfully.
Feb 13 08:41:41.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-145.40.67.89:22-139.178.68.195:50100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:41.340834 systemd[1]: session-86.scope: Deactivated successfully.
Feb 13 08:41:41.341332 systemd-logind[1463]: Session 86 logged out. Waiting for processes to exit.
Feb 13 08:41:41.341848 systemd-logind[1463]: Removed session 86.
Feb 13 08:41:41.995994 kubelet[2614]: I0213 08:41:41.995913 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9"
Feb 13 08:41:41.997468 env[1475]: time="2024-02-13T08:41:41.997339559Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\""
Feb 13 08:41:42.046771 env[1475]: time="2024-02-13T08:41:42.046685441Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:42.047003 kubelet[2614]: E0213 08:41:42.046960 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9"
Feb 13 08:41:42.047103 kubelet[2614]: E0213 08:41:42.047024 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9}
Feb 13 08:41:42.047103 kubelet[2614]: E0213 08:41:42.047069 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:41:42.047233 kubelet[2614]: E0213 08:41:42.047108 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:41:44.021116 systemd[1]: Started sshd@108-145.40.67.89:22-61.83.148.111:45536.service.
Feb 13 08:41:44.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.67.89:22-61.83.148.111:45536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:44.048211 kernel: kauditd_printk_skb: 70 callbacks suppressed
Feb 13 08:41:44.048282 kernel: audit: type=1130 audit(1707813704.020:2080): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.67.89:22-61.83.148.111:45536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:41:44.915964 sshd[6582]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=61.83.148.111 user=root
Feb 13 08:41:44.915000 audit[6582]: USER_AUTH pid=6582 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=61.83.148.111 addr=61.83.148.111 terminal=ssh res=failed'
Feb 13 08:41:45.006071 kernel: audit: type=1100 audit(1707813704.915:2081): pid=6582 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=61.83.148.111 addr=61.83.148.111 terminal=ssh res=failed'
Feb 13 08:41:45.293599 kubelet[2614]: I0213 08:41:45.293536 2614 topology_manager.go:210] "Topology Admit Handler"
Feb 13 08:41:45.294016 kubelet[2614]: I0213 08:41:45.294005 2614 topology_manager.go:210] "Topology Admit Handler"
Feb 13 08:41:45.294201 kubelet[2614]: I0213 08:41:45.294192 2614 topology_manager.go:210] "Topology Admit Handler"
Feb 13 08:41:45.296449 systemd[1]: Created slice kubepods-besteffort-pod59c850d0_0181_4283_967e_a70a4b1b7e64.slice.
Feb 13 08:41:45.298649 systemd[1]: Created slice kubepods-burstable-pod8659a212_b7b0_442b_a180_caeaa9464f84.slice.
Feb 13 08:41:45.300896 systemd[1]: Created slice kubepods-burstable-pod1dbcd21d_393a_43bf_9e25_9f59bc66daab.slice.
Feb 13 08:41:45.391351 kubelet[2614]: I0213 08:41:45.391285 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c850d0-0181-4283-967e-a70a4b1b7e64-tigera-ca-bundle\") pod \"calico-kube-controllers-5b75d6c9c-m8j85\" (UID: \"59c850d0-0181-4283-967e-a70a4b1b7e64\") " pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85"
Feb 13 08:41:45.391736 kubelet[2614]: I0213 08:41:45.391405 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkkqc\" (UniqueName: \"kubernetes.io/projected/59c850d0-0181-4283-967e-a70a4b1b7e64-kube-api-access-nkkqc\") pod \"calico-kube-controllers-5b75d6c9c-m8j85\" (UID: \"59c850d0-0181-4283-967e-a70a4b1b7e64\") " pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85"
Feb 13 08:41:45.391736 kubelet[2614]: I0213 08:41:45.391603 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbcd21d-393a-43bf-9e25-9f59bc66daab-config-volume\") pod \"coredns-787d4945fb-2shjr\" (UID: \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\") " pod="kube-system/coredns-787d4945fb-2shjr"
Feb 13 08:41:45.392107 kubelet[2614]: I0213 08:41:45.391742 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfpbv\" (UniqueName: \"kubernetes.io/projected/1dbcd21d-393a-43bf-9e25-9f59bc66daab-kube-api-access-bfpbv\") pod \"coredns-787d4945fb-2shjr\" (UID: \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\") " pod="kube-system/coredns-787d4945fb-2shjr"
Feb 13 08:41:45.392107 kubelet[2614]: I0213 08:41:45.391908 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8659a212-b7b0-442b-a180-caeaa9464f84-config-volume\") pod \"coredns-787d4945fb-9lrbl\" (UID: \"8659a212-b7b0-442b-a180-caeaa9464f84\") " pod="kube-system/coredns-787d4945fb-9lrbl"
Feb 13 08:41:45.392107 kubelet[2614]: I0213 08:41:45.392005 2614 reconciler_common.go:253] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dntdf\" (UniqueName: \"kubernetes.io/projected/8659a212-b7b0-442b-a180-caeaa9464f84-kube-api-access-dntdf\") pod \"coredns-787d4945fb-9lrbl\" (UID: \"8659a212-b7b0-442b-a180-caeaa9464f84\") " pod="kube-system/coredns-787d4945fb-9lrbl"
Feb 13 08:41:45.599088 env[1475]: time="2024-02-13T08:41:45.598894205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b75d6c9c-m8j85,Uid:59c850d0-0181-4283-967e-a70a4b1b7e64,Namespace:calico-system,Attempt:0,}"
Feb 13 08:41:45.600875 env[1475]: time="2024-02-13T08:41:45.600806830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-9lrbl,Uid:8659a212-b7b0-442b-a180-caeaa9464f84,Namespace:kube-system,Attempt:0,}"
Feb 13 08:41:45.602800 env[1475]: time="2024-02-13T08:41:45.602709273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-2shjr,Uid:1dbcd21d-393a-43bf-9e25-9f59bc66daab,Namespace:kube-system,Attempt:0,}"
Feb 13 08:41:45.672773 env[1475]: time="2024-02-13T08:41:45.672729646Z" level=error msg="Failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:45.672889 env[1475]: time="2024-02-13T08:41:45.672730161Z" level=error msg="Failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:45.673053 env[1475]: time="2024-02-13T08:41:45.673004702Z" level=error msg="encountered an error cleaning up failed sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:45.673053 env[1475]: time="2024-02-13T08:41:45.673024543Z" level=error msg="encountered an error cleaning up failed sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:45.673124 env[1475]: time="2024-02-13T08:41:45.673044349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-9lrbl,Uid:8659a212-b7b0-442b-a180-caeaa9464f84,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:45.673124 env[1475]: time="2024-02-13T08:41:45.673066154Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b75d6c9c-m8j85,Uid:59c850d0-0181-4283-967e-a70a4b1b7e64,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:41:45.673224 kubelet[2614]: E0213
08:41:45.673206 2614 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:45.673224 kubelet[2614]: E0213 08:41:45.673216 2614 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:45.673306 kubelet[2614]: E0213 08:41:45.673244 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-9lrbl" Feb 13 08:41:45.673306 kubelet[2614]: E0213 08:41:45.673244 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" Feb 13 08:41:45.673306 kubelet[2614]: E0213 08:41:45.673262 2614 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-9lrbl" Feb 13 08:41:45.673306 kubelet[2614]: E0213 08:41:45.673262 2614 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" Feb 13 08:41:45.673403 kubelet[2614]: E0213 08:41:45.673310 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-9lrbl_kube-system(8659a212-b7b0-442b-a180-caeaa9464f84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-787d4945fb-9lrbl_kube-system(8659a212-b7b0-442b-a180-caeaa9464f84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:41:45.673403 kubelet[2614]: E0213 08:41:45.673310 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5b75d6c9c-m8j85_calico-system(59c850d0-0181-4283-967e-a70a4b1b7e64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-5b75d6c9c-m8j85_calico-system(59c850d0-0181-4283-967e-a70a4b1b7e64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:41:45.674269 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37-shm.mount: Deactivated successfully. Feb 13 08:41:45.674340 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4-shm.mount: Deactivated successfully. Feb 13 08:41:45.677083 env[1475]: time="2024-02-13T08:41:45.677055146Z" level=error msg="Failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:45.677253 env[1475]: time="2024-02-13T08:41:45.677237080Z" level=error msg="encountered an error cleaning up failed sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:45.677285 env[1475]: time="2024-02-13T08:41:45.677266214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-787d4945fb-2shjr,Uid:1dbcd21d-393a-43bf-9e25-9f59bc66daab,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup 
network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:45.677371 kubelet[2614]: E0213 08:41:45.677362 2614 remote_runtime.go:176] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:45.677404 kubelet[2614]: E0213 08:41:45.677390 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-2shjr" Feb 13 08:41:45.677404 kubelet[2614]: E0213 08:41:45.677404 2614 kuberuntime_manager.go:782] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-787d4945fb-2shjr" Feb 13 08:41:45.677491 kubelet[2614]: E0213 08:41:45.677430 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-787d4945fb-2shjr_kube-system(1dbcd21d-393a-43bf-9e25-9f59bc66daab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-787d4945fb-2shjr_kube-system(1dbcd21d-393a-43bf-9e25-9f59bc66daab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:41:46.006285 kubelet[2614]: I0213 08:41:46.006201 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:41:46.007524 env[1475]: time="2024-02-13T08:41:46.007413960Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:41:46.008290 kubelet[2614]: I0213 08:41:46.008245 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:41:46.009469 env[1475]: time="2024-02-13T08:41:46.009352280Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:41:46.010513 kubelet[2614]: I0213 08:41:46.010436 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:41:46.011576 env[1475]: time="2024-02-13T08:41:46.011472484Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:41:46.044069 sshd[6582]: Failed password for root from 61.83.148.111 port 45536 ssh2 Feb 13 08:41:46.056189 env[1475]: time="2024-02-13T08:41:46.056140360Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox 
\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:46.056378 kubelet[2614]: E0213 08:41:46.056360 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:41:46.056452 kubelet[2614]: E0213 08:41:46.056397 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:41:46.056452 kubelet[2614]: E0213 08:41:46.056433 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:41:46.056582 env[1475]: time="2024-02-13T08:41:46.056413662Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 13 08:41:46.056646 kubelet[2614]: E0213 08:41:46.056482 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:41:46.056646 kubelet[2614]: E0213 08:41:46.056554 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:41:46.056646 kubelet[2614]: E0213 08:41:46.056583 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:41:46.056646 kubelet[2614]: E0213 08:41:46.056616 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" Feb 13 08:41:46.056868 kubelet[2614]: E0213 08:41:46.056643 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:41:46.056868 kubelet[2614]: E0213 08:41:46.056803 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:41:46.056868 kubelet[2614]: E0213 08:41:46.056819 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:41:46.056868 kubelet[2614]: E0213 08:41:46.056850 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:41:46.057115 env[1475]: time="2024-02-13T08:41:46.056676547Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:46.057156 kubelet[2614]: E0213 08:41:46.056874 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:41:46.349654 systemd[1]: Started sshd@109-145.40.67.89:22-139.178.68.195:47732.service. Feb 13 08:41:46.349000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.67.89:22-139.178.68.195:47732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:46.441980 kernel: audit: type=1130 audit(1707813706.349:2082): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.67.89:22-139.178.68.195:47732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:46.469000 audit[6826]: USER_ACCT pid=6826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.471060 sshd[6826]: Accepted publickey for core from 139.178.68.195 port 47732 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:46.474288 sshd[6826]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:46.484995 systemd-logind[1463]: New session 87 of user core. Feb 13 08:41:46.487572 systemd[1]: Started session-87.scope. Feb 13 08:41:46.516860 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a-shm.mount: Deactivated successfully. Feb 13 08:41:46.472000 audit[6826]: CRED_ACQ pid=6826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.573414 sshd[6826]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:46.574935 systemd[1]: sshd@109-145.40.67.89:22-139.178.68.195:47732.service: Deactivated successfully. Feb 13 08:41:46.575359 systemd[1]: session-87.scope: Deactivated successfully. Feb 13 08:41:46.575755 systemd-logind[1463]: Session 87 logged out. Waiting for processes to exit. Feb 13 08:41:46.576254 systemd-logind[1463]: Removed session 87. Feb 13 08:41:46.633894 sshd[6582]: Received disconnect from 61.83.148.111 port 45536:11: Bye Bye [preauth] Feb 13 08:41:46.633894 sshd[6582]: Disconnected from authenticating user root 61.83.148.111 port 45536 [preauth] Feb 13 08:41:46.634341 systemd[1]: sshd@108-145.40.67.89:22-61.83.148.111:45536.service: Deactivated successfully. 
Feb 13 08:41:46.654114 kernel: audit: type=1101 audit(1707813706.469:2083): pid=6826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.654151 kernel: audit: type=1103 audit(1707813706.472:2084): pid=6826 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.654165 kernel: audit: type=1006 audit(1707813706.472:2085): pid=6826 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=87 res=1 Feb 13 08:41:46.712772 kernel: audit: type=1300 audit(1707813706.472:2085): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa0d0fbd0 a2=3 a3=0 items=0 ppid=1 pid=6826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:46.472000 audit[6826]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa0d0fbd0 a2=3 a3=0 items=0 ppid=1 pid=6826 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:46.805410 kernel: audit: type=1327 audit(1707813706.472:2085): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:46.472000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:46.836359 kernel: audit: type=1105 audit(1707813706.497:2086): pid=6826 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.497000 audit[6826]: USER_START pid=6826 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.931873 kernel: audit: type=1103 audit(1707813706.499:2087): pid=6828 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.499000 audit[6828]: CRED_ACQ pid=6828 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.573000 audit[6826]: USER_END pid=6826 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.573000 audit[6826]: CRED_DISP pid=6826 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:46.574000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-145.40.67.89:22-139.178.68.195:47732 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:46.633000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-145.40.67.89:22-61.83.148.111:45536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:51.582782 systemd[1]: Started sshd@110-145.40.67.89:22-139.178.68.195:47746.service. Feb 13 08:41:51.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.67.89:22-139.178.68.195:47746 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:51.609528 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:41:51.609605 kernel: audit: type=1130 audit(1707813711.581:2092): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.67.89:22-139.178.68.195:47746 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:51.727000 audit[6851]: USER_ACCT pid=6851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.728424 sshd[6851]: Accepted publickey for core from 139.178.68.195 port 47746 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:51.729289 sshd[6851]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:51.732223 systemd-logind[1463]: New session 88 of user core. Feb 13 08:41:51.732773 systemd[1]: Started session-88.scope. Feb 13 08:41:51.812762 sshd[6851]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:51.814409 systemd[1]: sshd@110-145.40.67.89:22-139.178.68.195:47746.service: Deactivated successfully. 
Feb 13 08:41:51.814841 systemd[1]: session-88.scope: Deactivated successfully. Feb 13 08:41:51.815352 systemd-logind[1463]: Session 88 logged out. Waiting for processes to exit. Feb 13 08:41:51.816018 systemd-logind[1463]: Removed session 88. Feb 13 08:41:51.728000 audit[6851]: CRED_ACQ pid=6851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.911747 kernel: audit: type=1101 audit(1707813711.727:2093): pid=6851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.911782 kernel: audit: type=1103 audit(1707813711.728:2094): pid=6851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.911798 kernel: audit: type=1006 audit(1707813711.728:2095): pid=6851 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=88 res=1 Feb 13 08:41:51.970354 kernel: audit: type=1300 audit(1707813711.728:2095): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe476c2ca0 a2=3 a3=0 items=0 ppid=1 pid=6851 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:51.728000 audit[6851]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe476c2ca0 a2=3 a3=0 items=0 ppid=1 pid=6851 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd" exe="/usr/sbin/sshd" 
subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:52.062389 kernel: audit: type=1327 audit(1707813711.728:2095): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:51.728000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:52.092909 kernel: audit: type=1105 audit(1707813711.735:2096): pid=6851 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.735000 audit[6851]: USER_START pid=6851 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:52.187586 kernel: audit: type=1103 audit(1707813711.736:2097): pid=6853 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.736000 audit[6853]: CRED_ACQ pid=6853 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:52.264853 env[1475]: time="2024-02-13T08:41:52.264812371Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:41:52.276490 env[1475]: time="2024-02-13T08:41:52.276454539Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox 
\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:52.276689 kubelet[2614]: E0213 08:41:52.276652 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:41:52.276689 kubelet[2614]: E0213 08:41:52.276681 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:41:52.276896 kernel: audit: type=1106 audit(1707813711.812:2098): pid=6851 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.812000 audit[6851]: USER_END pid=6851 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:52.276971 kubelet[2614]: E0213 08:41:52.276709 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to 
destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:41:52.276971 kubelet[2614]: E0213 08:41:52.276729 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:41:52.372405 kernel: audit: type=1104 audit(1707813711.812:2099): pid=6851 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.812000 audit[6851]: CRED_DISP pid=6851 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:51.813000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-145.40.67.89:22-139.178.68.195:47746 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:56.816295 systemd[1]: Started sshd@111-145.40.67.89:22-139.178.68.195:60442.service. 
Feb 13 08:41:56.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.67.89:22-139.178.68.195:60442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:56.843418 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:41:56.843489 kernel: audit: type=1130 audit(1707813716.815:2101): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.67.89:22-139.178.68.195:60442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:41:56.960000 audit[6907]: USER_ACCT pid=6907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:56.962061 sshd[6907]: Accepted publickey for core from 139.178.68.195 port 60442 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:41:56.963233 sshd[6907]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:41:56.966025 systemd-logind[1463]: New session 89 of user core. Feb 13 08:41:56.966581 systemd[1]: Started session-89.scope. Feb 13 08:41:57.046180 sshd[6907]: pam_unix(sshd:session): session closed for user core Feb 13 08:41:57.047725 systemd[1]: sshd@111-145.40.67.89:22-139.178.68.195:60442.service: Deactivated successfully. Feb 13 08:41:57.048350 systemd[1]: session-89.scope: Deactivated successfully. Feb 13 08:41:57.048728 systemd-logind[1463]: Session 89 logged out. Waiting for processes to exit. Feb 13 08:41:57.049242 systemd-logind[1463]: Removed session 89. 
Feb 13 08:41:56.962000 audit[6907]: CRED_ACQ pid=6907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.145952 kernel: audit: type=1101 audit(1707813716.960:2102): pid=6907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.145991 kernel: audit: type=1103 audit(1707813716.962:2103): pid=6907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.146010 kernel: audit: type=1006 audit(1707813716.962:2104): pid=6907 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=89 res=1 Feb 13 08:41:57.204015 kernel: audit: type=1300 audit(1707813716.962:2104): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff23508760 a2=3 a3=0 items=0 ppid=1 pid=6907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:56.962000 audit[6907]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff23508760 a2=3 a3=0 items=0 ppid=1 pid=6907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:41:57.265373 env[1475]: time="2024-02-13T08:41:57.265333827Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:41:57.277002 env[1475]: 
time="2024-02-13T08:41:57.276920443Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:57.277090 kubelet[2614]: E0213 08:41:57.277079 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:41:57.277255 kubelet[2614]: E0213 08:41:57.277104 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:41:57.277255 kubelet[2614]: E0213 08:41:57.277127 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:41:57.277255 kubelet[2614]: E0213 08:41:57.277147 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: 
code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:41:57.296094 kernel: audit: type=1327 audit(1707813716.962:2104): proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:56.962000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:41:57.326621 kernel: audit: type=1105 audit(1707813716.968:2105): pid=6907 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:56.968000 audit[6907]: USER_START pid=6907 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.421169 kernel: audit: type=1103 audit(1707813716.969:2106): pid=6909 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:56.969000 audit[6909]: CRED_ACQ pid=6909 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.510438 kernel: audit: type=1106 audit(1707813717.046:2107): pid=6907 uid=0 
auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.046000 audit[6907]: USER_END pid=6907 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.606012 kernel: audit: type=1104 audit(1707813717.046:2108): pid=6907 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.046000 audit[6907]: CRED_DISP pid=6907 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:41:57.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-145.40.67.89:22-139.178.68.195:60442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:41:59.264784 env[1475]: time="2024-02-13T08:41:59.264754236Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:41:59.277717 env[1475]: time="2024-02-13T08:41:59.277652895Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:41:59.277849 kubelet[2614]: E0213 08:41:59.277837 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:41:59.278039 kubelet[2614]: E0213 08:41:59.277866 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:41:59.278039 kubelet[2614]: E0213 08:41:59.277892 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:41:59.278039 kubelet[2614]: E0213 08:41:59.277913 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:42:01.264715 env[1475]: time="2024-02-13T08:42:01.264642087Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:42:01.277577 env[1475]: time="2024-02-13T08:42:01.277535253Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:01.277722 kubelet[2614]: E0213 08:42:01.277710 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:42:01.277896 kubelet[2614]: E0213 08:42:01.277737 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:42:01.277896 kubelet[2614]: E0213 08:42:01.277763 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:01.277896 kubelet[2614]: E0213 08:42:01.277785 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:42:02.055992 systemd[1]: Started sshd@112-145.40.67.89:22-139.178.68.195:60446.service. Feb 13 08:42:02.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.67.89:22-139.178.68.195:60446 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:02.082688 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:02.082754 kernel: audit: type=1130 audit(1707813722.055:2110): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.67.89:22-139.178.68.195:60446 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 13 08:42:02.199000 audit[7016]: USER_ACCT pid=7016 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.200663 sshd[7016]: Accepted publickey for core from 139.178.68.195 port 60446 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:02.203701 sshd[7016]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:02.208348 systemd-logind[1463]: New session 90 of user core. Feb 13 08:42:02.208852 systemd[1]: Started session-90.scope. Feb 13 08:42:02.289869 sshd[7016]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:02.291583 systemd[1]: sshd@112-145.40.67.89:22-139.178.68.195:60446.service: Deactivated successfully. Feb 13 08:42:02.292118 systemd[1]: session-90.scope: Deactivated successfully. Feb 13 08:42:02.292672 systemd-logind[1463]: Session 90 logged out. Waiting for processes to exit. Feb 13 08:42:02.293365 systemd-logind[1463]: Removed session 90. 
Feb 13 08:42:02.202000 audit[7016]: CRED_ACQ pid=7016 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.385632 kernel: audit: type=1101 audit(1707813722.199:2111): pid=7016 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.385672 kernel: audit: type=1103 audit(1707813722.202:2112): pid=7016 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.385689 kernel: audit: type=1006 audit(1707813722.202:2113): pid=7016 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=90 res=1 Feb 13 08:42:02.444205 kernel: audit: type=1300 audit(1707813722.202:2113): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2e387510 a2=3 a3=0 items=0 ppid=1 pid=7016 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:02.202000 audit[7016]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2e387510 a2=3 a3=0 items=0 ppid=1 pid=7016 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:02.536256 kernel: audit: type=1327 audit(1707813722.202:2113): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:02.202000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:02.566759 kernel: audit: type=1105 audit(1707813722.210:2114): pid=7016 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.210000 audit[7016]: USER_START pid=7016 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.211000 audit[7018]: CRED_ACQ pid=7018 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.750520 kernel: audit: type=1103 audit(1707813722.211:2115): pid=7018 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.750587 kernel: audit: type=1106 audit(1707813722.289:2116): pid=7016 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.289000 audit[7016]: USER_END pid=7016 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:42:02.289000 audit[7016]: CRED_DISP pid=7016 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.935537 kernel: audit: type=1104 audit(1707813722.289:2117): pid=7016 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:02.290000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-145.40.67.89:22-139.178.68.195:60446 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:03.264684 env[1475]: time="2024-02-13T08:42:03.264630801Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:42:03.277792 env[1475]: time="2024-02-13T08:42:03.277730288Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:03.277922 kubelet[2614]: E0213 08:42:03.277909 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:42:03.278114 kubelet[2614]: E0213 08:42:03.277951 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:42:03.278114 kubelet[2614]: E0213 08:42:03.277980 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:03.278114 kubelet[2614]: E0213 08:42:03.278002 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:42:07.301119 systemd[1]: Started sshd@113-145.40.67.89:22-139.178.68.195:45902.service. Feb 13 08:42:07.300000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.67.89:22-139.178.68.195:45902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:07.333554 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:07.333622 kernel: audit: type=1130 audit(1707813727.300:2119): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.67.89:22-139.178.68.195:45902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:07.450000 audit[7070]: USER_ACCT pid=7070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.451432 sshd[7070]: Accepted publickey for core from 139.178.68.195 port 45902 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:07.452215 sshd[7070]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:07.454574 systemd-logind[1463]: New session 91 of user core. Feb 13 08:42:07.455021 systemd[1]: Started session-91.scope. Feb 13 08:42:07.468853 systemd[1]: Started sshd@114-145.40.67.89:22-161.35.108.241:59790.service. Feb 13 08:42:07.532707 sshd[7070]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:07.534166 systemd[1]: sshd@113-145.40.67.89:22-139.178.68.195:45902.service: Deactivated successfully. Feb 13 08:42:07.534617 systemd[1]: session-91.scope: Deactivated successfully. Feb 13 08:42:07.534911 systemd-logind[1463]: Session 91 logged out. Waiting for processes to exit. Feb 13 08:42:07.535448 systemd-logind[1463]: Removed session 91. 
Feb 13 08:42:07.451000 audit[7070]: CRED_ACQ pid=7070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.633313 kernel: audit: type=1101 audit(1707813727.450:2120): pid=7070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.633402 kernel: audit: type=1103 audit(1707813727.451:2121): pid=7070 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.633420 kernel: audit: type=1006 audit(1707813727.451:2122): pid=7070 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=91 res=1 Feb 13 08:42:07.691960 kernel: audit: type=1300 audit(1707813727.451:2122): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd23e88df0 a2=3 a3=0 items=0 ppid=1 pid=7070 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:07.451000 audit[7070]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd23e88df0 a2=3 a3=0 items=0 ppid=1 pid=7070 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:07.784011 kernel: audit: type=1327 audit(1707813727.451:2122): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:07.451000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:07.814567 kernel: audit: type=1105 audit(1707813727.456:2123): pid=7070 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.456000 audit[7070]: USER_START pid=7070 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.456000 audit[7072]: CRED_ACQ pid=7072 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.998462 kernel: audit: type=1103 audit(1707813727.456:2124): pid=7072 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.998539 kernel: audit: type=1130 audit(1707813727.467:2125): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.67.89:22-161.35.108.241:59790 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:07.467000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.67.89:22-161.35.108.241:59790 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:08.045893 sshd[7074]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:42:07.532000 audit[7070]: USER_END pid=7070 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:08.183009 kernel: audit: type=1106 audit(1707813727.532:2126): pid=7070 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.532000 audit[7070]: CRED_DISP pid=7070 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:07.533000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-145.40.67.89:22-139.178.68.195:45902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:08.044000 audit[7074]: USER_AUTH pid=7074 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:42:09.134318 sshd[7074]: Failed password for root from 161.35.108.241 port 59790 ssh2 Feb 13 08:42:09.543467 sshd[7074]: Received disconnect from 161.35.108.241 port 59790:11: Bye Bye [preauth] Feb 13 08:42:09.543467 sshd[7074]: Disconnected from authenticating user root 161.35.108.241 port 59790 [preauth] Feb 13 08:42:09.546155 systemd[1]: sshd@114-145.40.67.89:22-161.35.108.241:59790.service: Deactivated successfully. Feb 13 08:42:09.545000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-145.40.67.89:22-161.35.108.241:59790 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:09.781734 systemd[1]: Started sshd@115-145.40.67.89:22-43.153.15.221:51430.service. Feb 13 08:42:09.781000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.67.89:22-43.153.15.221:51430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:09.906558 sshd[7098]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:42:09.905000 audit[7098]: USER_AUTH pid=7098 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? 
acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:42:11.466815 sshd[7098]: Failed password for root from 43.153.15.221 port 51430 ssh2 Feb 13 08:42:12.264716 env[1475]: time="2024-02-13T08:42:12.264660133Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:42:12.278163 env[1475]: time="2024-02-13T08:42:12.278099476Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:12.278289 kubelet[2614]: E0213 08:42:12.278277 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:42:12.278490 kubelet[2614]: E0213 08:42:12.278304 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:42:12.278490 kubelet[2614]: E0213 08:42:12.278329 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:12.278490 kubelet[2614]: E0213 08:42:12.278349 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:42:12.541730 systemd[1]: Started sshd@116-145.40.67.89:22-139.178.68.195:45904.service. Feb 13 08:42:12.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-145.40.67.89:22-139.178.68.195:45904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:12.568866 kernel: kauditd_printk_skb: 6 callbacks suppressed Feb 13 08:42:12.568904 kernel: audit: type=1130 audit(1707813732.540:2133): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-145.40.67.89:22-139.178.68.195:45904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:12.687345 sshd[7128]: Accepted publickey for core from 139.178.68.195 port 45904 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:12.686000 audit[7128]: USER_ACCT pid=7128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.689218 sshd[7128]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:12.691619 systemd-logind[1463]: New session 92 of user core. Feb 13 08:42:12.692196 systemd[1]: Started session-92.scope. Feb 13 08:42:12.769394 sshd[7128]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:12.770707 systemd[1]: sshd@116-145.40.67.89:22-139.178.68.195:45904.service: Deactivated successfully. Feb 13 08:42:12.771132 systemd[1]: session-92.scope: Deactivated successfully. Feb 13 08:42:12.771510 systemd-logind[1463]: Session 92 logged out. Waiting for processes to exit. Feb 13 08:42:12.771918 systemd-logind[1463]: Removed session 92. Feb 13 08:42:12.772039 sshd[7098]: Received disconnect from 43.153.15.221 port 51430:11: Bye Bye [preauth] Feb 13 08:42:12.772039 sshd[7098]: Disconnected from authenticating user root 43.153.15.221 port 51430 [preauth] Feb 13 08:42:12.772585 systemd[1]: sshd@115-145.40.67.89:22-43.153.15.221:51430.service: Deactivated successfully. 
Feb 13 08:42:12.688000 audit[7128]: CRED_ACQ pid=7128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.870334 kernel: audit: type=1101 audit(1707813732.686:2134): pid=7128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.870376 kernel: audit: type=1103 audit(1707813732.688:2135): pid=7128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.870392 kernel: audit: type=1006 audit(1707813732.688:2136): pid=7128 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=92 res=1 Feb 13 08:42:12.929290 kernel: audit: type=1300 audit(1707813732.688:2136): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd927895e0 a2=3 a3=0 items=0 ppid=1 pid=7128 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:12.688000 audit[7128]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd927895e0 a2=3 a3=0 items=0 ppid=1 pid=7128 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:13.021821 kernel: audit: type=1327 audit(1707813732.688:2136): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:12.688000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:13.052519 kernel: audit: type=1105 audit(1707813732.693:2137): pid=7128 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.693000 audit[7128]: USER_START pid=7128 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:13.147598 kernel: audit: type=1103 audit(1707813732.694:2138): pid=7130 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.694000 audit[7130]: CRED_ACQ pid=7130 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:13.237500 kernel: audit: type=1106 audit(1707813732.769:2139): pid=7128 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.769000 audit[7128]: USER_END pid=7128 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:42:13.265270 env[1475]: time="2024-02-13T08:42:13.265205938Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:42:13.277431 env[1475]: time="2024-02-13T08:42:13.277363777Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:13.277572 kubelet[2614]: E0213 08:42:13.277528 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:42:13.277572 kubelet[2614]: E0213 08:42:13.277551 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:42:13.277572 kubelet[2614]: E0213 08:42:13.277573 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:13.277696 
kubelet[2614]: E0213 08:42:13.277590 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:42:12.769000 audit[7128]: CRED_DISP pid=7128 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:13.423428 kernel: audit: type=1104 audit(1707813732.769:2140): pid=7128 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:12.769000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-145.40.67.89:22-139.178.68.195:45904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:12.771000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-145.40.67.89:22-43.153.15.221:51430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:16.265188 env[1475]: time="2024-02-13T08:42:16.265132100Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:42:16.265395 env[1475]: time="2024-02-13T08:42:16.265132055Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:42:16.278848 env[1475]: time="2024-02-13T08:42:16.278809684Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:16.279027 kubelet[2614]: E0213 08:42:16.278981 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:42:16.279027 kubelet[2614]: E0213 08:42:16.279016 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:42:16.279254 kubelet[2614]: E0213 08:42:16.279055 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:16.279254 kubelet[2614]: E0213 08:42:16.279086 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:42:16.279358 env[1475]: time="2024-02-13T08:42:16.279204753Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:16.279390 kubelet[2614]: E0213 08:42:16.279293 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:42:16.279390 kubelet[2614]: E0213 08:42:16.279310 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:42:16.279390 kubelet[2614]: E0213 08:42:16.279333 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:16.279390 kubelet[2614]: E0213 08:42:16.279351 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:42:17.779175 systemd[1]: Started sshd@117-145.40.67.89:22-139.178.68.195:51670.service. Feb 13 08:42:17.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-145.40.67.89:22-139.178.68.195:51670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:17.806341 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:42:17.806418 kernel: audit: type=1130 audit(1707813737.778:2143): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-145.40.67.89:22-139.178.68.195:51670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 13 08:42:17.925000 audit[7241]: USER_ACCT pid=7241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:17.926420 sshd[7241]: Accepted publickey for core from 139.178.68.195 port 51670 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:17.927211 sshd[7241]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:17.929471 systemd-logind[1463]: New session 93 of user core. Feb 13 08:42:17.930017 systemd[1]: Started session-93.scope. Feb 13 08:42:18.007489 sshd[7241]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:18.008978 systemd[1]: sshd@117-145.40.67.89:22-139.178.68.195:51670.service: Deactivated successfully. Feb 13 08:42:18.009456 systemd[1]: session-93.scope: Deactivated successfully. Feb 13 08:42:18.009775 systemd-logind[1463]: Session 93 logged out. Waiting for processes to exit. Feb 13 08:42:18.010338 systemd-logind[1463]: Removed session 93. 
Feb 13 08:42:17.926000 audit[7241]: CRED_ACQ pid=7241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.109243 kernel: audit: type=1101 audit(1707813737.925:2144): pid=7241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.109284 kernel: audit: type=1103 audit(1707813737.926:2145): pid=7241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.109300 kernel: audit: type=1006 audit(1707813737.926:2146): pid=7241 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=93 res=1 Feb 13 08:42:18.168136 kernel: audit: type=1300 audit(1707813737.926:2146): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff0039f250 a2=3 a3=0 items=0 ppid=1 pid=7241 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:17.926000 audit[7241]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff0039f250 a2=3 a3=0 items=0 ppid=1 pid=7241 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:18.260623 kernel: audit: type=1327 audit(1707813737.926:2146): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:17.926000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:18.291326 kernel: audit: type=1105 audit(1707813737.931:2147): pid=7241 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:17.931000 audit[7241]: USER_START pid=7241 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.386490 kernel: audit: type=1103 audit(1707813737.931:2148): pid=7243 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:17.931000 audit[7243]: CRED_ACQ pid=7243 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.475853 kernel: audit: type=1106 audit(1707813738.007:2149): pid=7241 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.007000 audit[7241]: USER_END pid=7241 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:42:18.571386 kernel: audit: type=1104 audit(1707813738.007:2150): pid=7241 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.007000 audit[7241]: CRED_DISP pid=7241 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:18.008000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-145.40.67.89:22-139.178.68.195:51670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:23.016688 systemd[1]: Started sshd@118-145.40.67.89:22-139.178.68.195:51676.service. Feb 13 08:42:23.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-145.40.67.89:22-139.178.68.195:51676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:23.043581 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:23.043625 kernel: audit: type=1130 audit(1707813743.015:2152): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-145.40.67.89:22-139.178.68.195:51676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:23.161000 audit[7266]: USER_ACCT pid=7266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.162404 sshd[7266]: Accepted publickey for core from 139.178.68.195 port 51676 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:23.164024 sshd[7266]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:23.166501 systemd-logind[1463]: New session 94 of user core. Feb 13 08:42:23.167052 systemd[1]: Started session-94.scope. Feb 13 08:42:23.247320 sshd[7266]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:23.248816 systemd[1]: sshd@118-145.40.67.89:22-139.178.68.195:51676.service: Deactivated successfully. Feb 13 08:42:23.249344 systemd[1]: session-94.scope: Deactivated successfully. Feb 13 08:42:23.249709 systemd-logind[1463]: Session 94 logged out. Waiting for processes to exit. Feb 13 08:42:23.250600 systemd-logind[1463]: Removed session 94. 
Feb 13 08:42:23.162000 audit[7266]: CRED_ACQ pid=7266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.346774 kernel: audit: type=1101 audit(1707813743.161:2153): pid=7266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.346847 kernel: audit: type=1103 audit(1707813743.162:2154): pid=7266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.346868 kernel: audit: type=1006 audit(1707813743.162:2155): pid=7266 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=94 res=1 Feb 13 08:42:23.405419 kernel: audit: type=1300 audit(1707813743.162:2155): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeeed49ae0 a2=3 a3=0 items=0 ppid=1 pid=7266 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:23.162000 audit[7266]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeeed49ae0 a2=3 a3=0 items=0 ppid=1 pid=7266 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:23.162000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:23.527957 kernel: audit: type=1327 audit(1707813743.162:2155): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:23.527985 kernel: audit: type=1105 audit(1707813743.169:2156): pid=7266 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.169000 audit[7266]: USER_START pid=7266 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.622456 kernel: audit: type=1103 audit(1707813743.169:2157): pid=7268 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.169000 audit[7268]: CRED_ACQ pid=7268 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.711674 kernel: audit: type=1106 audit(1707813743.247:2158): pid=7266 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.247000 audit[7266]: USER_END pid=7266 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:42:23.807185 kernel: audit: type=1104 audit(1707813743.247:2159): pid=7266 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.247000 audit[7266]: CRED_DISP pid=7266 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:23.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-145.40.67.89:22-139.178.68.195:51676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:25.264613 env[1475]: time="2024-02-13T08:42:25.264589644Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:42:25.264613 env[1475]: time="2024-02-13T08:42:25.264589587Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:42:25.278084 env[1475]: time="2024-02-13T08:42:25.278011387Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:25.278084 env[1475]: time="2024-02-13T08:42:25.278009267Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:25.278221 kubelet[2614]: E0213 08:42:25.278183 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:42:25.278221 kubelet[2614]: E0213 08:42:25.278207 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:42:25.278221 kubelet[2614]: E0213 08:42:25.278210 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:42:25.278475 kubelet[2614]: E0213 08:42:25.278228 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:42:25.278475 kubelet[2614]: E0213 08:42:25.278245 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:25.278475 kubelet[2614]: E0213 08:42:25.278255 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:25.278475 kubelet[2614]: E0213 08:42:25.278267 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:42:25.278644 kubelet[2614]: E0213 08:42:25.278276 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:42:27.265972 env[1475]: time="2024-02-13T08:42:27.265854232Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:42:27.292065 env[1475]: time="2024-02-13T08:42:27.291993349Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:27.292208 kubelet[2614]: E0213 08:42:27.292163 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:42:27.292208 kubelet[2614]: E0213 08:42:27.292187 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:42:27.292208 kubelet[2614]: E0213 08:42:27.292208 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:27.292452 kubelet[2614]: E0213 08:42:27.292226 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:42:28.256724 systemd[1]: Started sshd@119-145.40.67.89:22-139.178.68.195:59656.service. Feb 13 08:42:28.255000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-145.40.67.89:22-139.178.68.195:59656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:28.283845 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:28.283890 kernel: audit: type=1130 audit(1707813748.255:2161): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-145.40.67.89:22-139.178.68.195:59656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:28.402000 audit[7377]: USER_ACCT pid=7377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.403350 sshd[7377]: Accepted publickey for core from 139.178.68.195 port 59656 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:28.405519 sshd[7377]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:28.410106 systemd-logind[1463]: New session 95 of user core. Feb 13 08:42:28.411689 systemd[1]: Started session-95.scope. Feb 13 08:42:28.491387 sshd[7377]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:28.492714 systemd[1]: sshd@119-145.40.67.89:22-139.178.68.195:59656.service: Deactivated successfully. Feb 13 08:42:28.493180 systemd[1]: session-95.scope: Deactivated successfully. Feb 13 08:42:28.493530 systemd-logind[1463]: Session 95 logged out. Waiting for processes to exit. Feb 13 08:42:28.493929 systemd-logind[1463]: Removed session 95. 
Feb 13 08:42:28.404000 audit[7377]: CRED_ACQ pid=7377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.585062 kernel: audit: type=1101 audit(1707813748.402:2162): pid=7377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.585100 kernel: audit: type=1103 audit(1707813748.404:2163): pid=7377 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.585118 kernel: audit: type=1006 audit(1707813748.404:2164): pid=7377 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=95 res=1 Feb 13 08:42:28.643699 kernel: audit: type=1300 audit(1707813748.404:2164): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2cfb4890 a2=3 a3=0 items=0 ppid=1 pid=7377 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:28.404000 audit[7377]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2cfb4890 a2=3 a3=0 items=0 ppid=1 pid=7377 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:28.735719 kernel: audit: type=1327 audit(1707813748.404:2164): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:28.404000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:28.766224 kernel: audit: type=1105 audit(1707813748.416:2165): pid=7377 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.416000 audit[7377]: USER_START pid=7377 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.860755 kernel: audit: type=1103 audit(1707813748.417:2166): pid=7379 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.417000 audit[7379]: CRED_ACQ pid=7379 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.950014 kernel: audit: type=1106 audit(1707813748.491:2167): pid=7377 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.491000 audit[7377]: USER_END pid=7377 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:42:29.045460 kernel: audit: type=1104 audit(1707813748.491:2168): pid=7377 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.491000 audit[7377]: CRED_DISP pid=7377 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:28.491000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-145.40.67.89:22-139.178.68.195:59656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:31.265743 env[1475]: time="2024-02-13T08:42:31.265548327Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:42:31.294787 env[1475]: time="2024-02-13T08:42:31.294726223Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:31.294901 kubelet[2614]: E0213 08:42:31.294880 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:42:31.295083 kubelet[2614]: E0213 08:42:31.294904 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:42:31.295083 kubelet[2614]: E0213 08:42:31.294939 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:31.295083 kubelet[2614]: E0213 08:42:31.294973 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:42:33.504265 systemd[1]: Started sshd@120-145.40.67.89:22-139.178.68.195:59664.service. Feb 13 08:42:33.503000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-145.40.67.89:22-139.178.68.195:59664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:33.531266 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:33.531310 kernel: audit: type=1130 audit(1707813753.503:2170): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-145.40.67.89:22-139.178.68.195:59664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:33.647000 audit[7428]: USER_ACCT pid=7428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.648394 sshd[7428]: Accepted publickey for core from 139.178.68.195 port 59664 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:33.649214 sshd[7428]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:33.651396 systemd-logind[1463]: New session 96 of user core. Feb 13 08:42:33.652081 systemd[1]: Started session-96.scope. Feb 13 08:42:33.734200 sshd[7428]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:33.735618 systemd[1]: sshd@120-145.40.67.89:22-139.178.68.195:59664.service: Deactivated successfully. Feb 13 08:42:33.736073 systemd[1]: session-96.scope: Deactivated successfully. Feb 13 08:42:33.736417 systemd-logind[1463]: Session 96 logged out. Waiting for processes to exit. Feb 13 08:42:33.736814 systemd-logind[1463]: Removed session 96. 
Feb 13 08:42:33.648000 audit[7428]: CRED_ACQ pid=7428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.830329 kernel: audit: type=1101 audit(1707813753.647:2171): pid=7428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.830376 kernel: audit: type=1103 audit(1707813753.648:2172): pid=7428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.830399 kernel: audit: type=1006 audit(1707813753.648:2173): pid=7428 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=96 res=1 Feb 13 08:42:33.648000 audit[7428]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd00dc3a70 a2=3 a3=0 items=0 ppid=1 pid=7428 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:33.648000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:34.011542 kernel: audit: type=1300 audit(1707813753.648:2173): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd00dc3a70 a2=3 a3=0 items=0 ppid=1 pid=7428 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:34.011589 kernel: audit: type=1327 audit(1707813753.648:2173): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:34.011617 kernel: audit: type=1105 audit(1707813753.653:2174): pid=7428 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.653000 audit[7428]: USER_START pid=7428 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:34.106069 kernel: audit: type=1103 audit(1707813753.653:2175): pid=7430 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.653000 audit[7430]: CRED_ACQ pid=7430 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:34.195309 kernel: audit: type=1106 audit(1707813753.733:2176): pid=7428 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.733000 audit[7428]: USER_END pid=7428 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:42:34.290884 kernel: audit: type=1104 audit(1707813753.733:2177): pid=7428 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.733000 audit[7428]: CRED_DISP pid=7428 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:33.734000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@120-145.40.67.89:22-139.178.68.195:59664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:35.612000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:42:35.612000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002b77f50 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:42:35.612000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:42:35.612000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:42:35.612000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c001ab3760 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:42:35.612000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:42:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:42:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c003814840 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:42:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:42:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 
tclass=file permissive=0 Feb 13 08:42:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c002e842a0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:42:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:42:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:42:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c011b04030 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:42:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:42:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:42:35.896000 audit[2443]: AVC avc: denied { 
watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:42:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c01199a420 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:42:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0144053b0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:42:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:42:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:42:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:42:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=6a a1=c0114c1740 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null)
Feb 13 08:42:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562
Feb 13 08:42:38.265639 env[1475]: time="2024-02-13T08:42:38.265536831Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\""
Feb 13 08:42:38.321061 env[1475]: time="2024-02-13T08:42:38.320958082Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:42:38.321256 kubelet[2614]: E0213 08:42:38.321221 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37"
Feb 13 08:42:38.321662 kubelet[2614]: E0213 08:42:38.321279 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37}
Feb 13 08:42:38.321662 kubelet[2614]: E0213 08:42:38.321344 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:42:38.321662 kubelet[2614]: E0213 08:42:38.321388 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84
Feb 13 08:42:38.747211 systemd[1]: Started sshd@121-145.40.67.89:22-139.178.68.195:52138.service.
Feb 13 08:42:38.746000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@121-145.40.67.89:22-139.178.68.195:52138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:28:38.775006 kernel: kauditd_printk_skb: 25 callbacks suppressed
Feb 13 08:42:38.775062 kernel: audit: type=1130 audit(1707813758.746:2187): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@121-145.40.67.89:22-139.178.68.195:52138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:38.891554 sshd[7488]: Accepted publickey for core from 139.178.68.195 port 52138 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:42:38.890000 audit[7488]: USER_ACCT pid=7488 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:38.892790 sshd[7488]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:42:38.895268 systemd-logind[1463]: New session 97 of user core.
Feb 13 08:42:38.895718 systemd[1]: Started session-97.scope.
Feb 13 08:42:38.972443 sshd[7488]: pam_unix(sshd:session): session closed for user core
Feb 13 08:42:38.973640 systemd[1]: sshd@121-145.40.67.89:22-139.178.68.195:52138.service: Deactivated successfully.
Feb 13 08:42:38.974052 systemd[1]: session-97.scope: Deactivated successfully.
Feb 13 08:42:38.974382 systemd-logind[1463]: Session 97 logged out. Waiting for processes to exit.
Feb 13 08:42:38.974756 systemd-logind[1463]: Removed session 97.
Feb 13 08:42:38.891000 audit[7488]: CRED_ACQ pid=7488 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:39.073576 kernel: audit: type=1101 audit(1707813758.890:2188): pid=7488 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:39.073626 kernel: audit: type=1103 audit(1707813758.891:2189): pid=7488 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:39.073646 kernel: audit: type=1006 audit(1707813758.891:2190): pid=7488 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=97 res=1
Feb 13 08:42:39.132189 kernel: audit: type=1300 audit(1707813758.891:2190): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff86f89ad0 a2=3 a3=0 items=0 ppid=1 pid=7488 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:42:38.891000 audit[7488]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff86f89ad0 a2=3 a3=0 items=0 ppid=1 pid=7488 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:42:39.224284 kernel: audit: type=1327 audit(1707813758.891:2190): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:42:38.891000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:42:39.254810 kernel: audit: type=1105 audit(1707813758.896:2191): pid=7488 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:38.896000 audit[7488]: USER_START pid=7488 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:38.897000 audit[7490]: CRED_ACQ pid=7490 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:39.438711 kernel: audit: type=1103 audit(1707813758.897:2192): pid=7490 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:39.438769 kernel: audit: type=1106 audit(1707813758.971:2193): pid=7488 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:38.971000 audit[7488]: USER_END pid=7488 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:38.972000 audit[7488]: CRED_DISP pid=7488 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:39.623747 kernel: audit: type=1104 audit(1707813758.972:2194): pid=7488 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:38.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@121-145.40.67.89:22-139.178.68.195:52138 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:40.266341 env[1475]: time="2024-02-13T08:42:40.266247121Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\""
Feb 13 08:42:40.282552 env[1475]: time="2024-02-13T08:42:40.282517108Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:42:40.282681 kubelet[2614]: E0213 08:42:40.282667 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a"
Feb 13 08:42:40.282894 kubelet[2614]: E0213 08:42:40.282699 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a}
Feb 13 08:42:40.282894 kubelet[2614]: E0213 08:42:40.282734 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:42:40.282894 kubelet[2614]: E0213 08:42:40.282767 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab
Feb 13 08:42:40.953000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:42:40.953000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002353c80 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:42:40.953000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:42:40.953000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:42:40.953000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c00103d3e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:42:40.953000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:42:40.953000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:42:40.953000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c002353ca0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:42:40.953000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:42:40.953000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0
Feb 13 08:42:40.953000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000649ac0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null)
Feb 13 08:42:40.953000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269
Feb 13 08:42:42.265693 env[1475]: time="2024-02-13T08:42:42.265586724Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\""
Feb 13 08:42:42.292785 env[1475]: time="2024-02-13T08:42:42.292711736Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:42:42.292922 kubelet[2614]: E0213 08:42:42.292910 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4"
Feb 13 08:42:42.293099 kubelet[2614]: E0213 08:42:42.292958 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4}
Feb 13 08:42:42.293099 kubelet[2614]: E0213 08:42:42.292992 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:42:42.293099 kubelet[2614]: E0213 08:42:42.293010 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64
Feb 13 08:42:43.983960 systemd[1]: Started sshd@122-145.40.67.89:22-139.178.68.195:52154.service.
Feb 13 08:42:43.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@122-145.40.67.89:22-139.178.68.195:52154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:44.011283 kernel: kauditd_printk_skb: 13 callbacks suppressed
Feb 13 08:42:44.011383 kernel: audit: type=1130 audit(1707813763.983:2200): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@122-145.40.67.89:22-139.178.68.195:52154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:44.129000 audit[7572]: USER_ACCT pid=7572 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.130952 sshd[7572]: Accepted publickey for core from 139.178.68.195 port 52154 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:42:44.135210 sshd[7572]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:42:44.144825 systemd-logind[1463]: New session 98 of user core.
Feb 13 08:42:44.147103 systemd[1]: Started session-98.scope.
Feb 13 08:42:44.133000 audit[7572]: CRED_ACQ pid=7572 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.231900 sshd[7572]: pam_unix(sshd:session): session closed for user core
Feb 13 08:42:44.233319 systemd[1]: sshd@122-145.40.67.89:22-139.178.68.195:52154.service: Deactivated successfully.
Feb 13 08:42:44.233732 systemd[1]: session-98.scope: Deactivated successfully.
Feb 13 08:42:44.234103 systemd-logind[1463]: Session 98 logged out. Waiting for processes to exit.
Feb 13 08:42:44.234693 systemd-logind[1463]: Removed session 98.
Feb 13 08:42:44.312694 kernel: audit: type=1101 audit(1707813764.129:2201): pid=7572 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.312730 kernel: audit: type=1103 audit(1707813764.133:2202): pid=7572 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.312765 kernel: audit: type=1006 audit(1707813764.133:2203): pid=7572 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=98 res=1
Feb 13 08:42:44.371352 kernel: audit: type=1300 audit(1707813764.133:2203): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeca5d54d0 a2=3 a3=0 items=0 ppid=1 pid=7572 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:42:44.133000 audit[7572]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeca5d54d0 a2=3 a3=0 items=0 ppid=1 pid=7572 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:42:44.463380 kernel: audit: type=1327 audit(1707813764.133:2203): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:42:44.133000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:42:44.153000 audit[7572]: USER_START pid=7572 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.588392 kernel: audit: type=1105 audit(1707813764.153:2204): pid=7572 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.588430 kernel: audit: type=1103 audit(1707813764.153:2205): pid=7574 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.153000 audit[7574]: CRED_ACQ pid=7574 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.228000 audit[7572]: USER_END pid=7572 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.773139 kernel: audit: type=1106 audit(1707813764.228:2206): pid=7572 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.773205 kernel: audit: type=1104 audit(1707813764.228:2207): pid=7572 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.228000 audit[7572]: CRED_DISP pid=7572 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:44.228000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@122-145.40.67.89:22-139.178.68.195:52154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:46.264700 env[1475]: time="2024-02-13T08:42:46.264654133Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\""
Feb 13 08:42:46.278358 env[1475]: time="2024-02-13T08:42:46.278322489Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:42:46.278547 kubelet[2614]: E0213 08:42:46.278502 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9"
Feb 13 08:42:46.278547 kubelet[2614]: E0213 08:42:46.278531 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9}
Feb 13 08:42:46.278760 kubelet[2614]: E0213 08:42:46.278556 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:42:46.278760 kubelet[2614]: E0213 08:42:46.278579 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e
Feb 13 08:42:49.242021 systemd[1]: Started sshd@123-145.40.67.89:22-139.178.68.195:49420.service.
Feb 13 08:42:49.241000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@123-145.40.67.89:22-139.178.68.195:49420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:49.265046 env[1475]: time="2024-02-13T08:42:49.264941631Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\""
Feb 13 08:42:49.269915 kernel: kauditd_printk_skb: 1 callbacks suppressed
Feb 13 08:42:49.270228 kernel: audit: type=1130 audit(1707813769.241:2209): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@123-145.40.67.89:22-139.178.68.195:49420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Feb 13 08:42:49.283118 env[1475]: time="2024-02-13T08:42:49.283039591Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 08:42:49.283261 kubelet[2614]: E0213 08:42:49.283245 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37"
Feb 13 08:42:49.283435 kubelet[2614]: E0213 08:42:49.283269 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37}
Feb 13 08:42:49.283435 kubelet[2614]: E0213 08:42:49.283290 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Feb 13 08:42:49.283435 kubelet[2614]: E0213 08:42:49.283306 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84
Feb 13 08:42:49.386000 audit[7626]: USER_ACCT pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.387320 sshd[7626]: Accepted publickey for core from 139.178.68.195 port 49420 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg
Feb 13 08:42:49.388497 sshd[7626]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Feb 13 08:42:49.390942 systemd-logind[1463]: New session 99 of user core.
Feb 13 08:42:49.391412 systemd[1]: Started session-99.scope.
Feb 13 08:42:49.468946 sshd[7626]: pam_unix(sshd:session): session closed for user core
Feb 13 08:42:49.470469 systemd[1]: sshd@123-145.40.67.89:22-139.178.68.195:49420.service: Deactivated successfully.
Feb 13 08:42:49.470886 systemd[1]: session-99.scope: Deactivated successfully.
Feb 13 08:42:49.471290 systemd-logind[1463]: Session 99 logged out. Waiting for processes to exit.
Feb 13 08:42:49.471818 systemd-logind[1463]: Removed session 99.
Feb 13 08:42:49.387000 audit[7626]: CRED_ACQ pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.569112 kernel: audit: type=1101 audit(1707813769.386:2210): pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.569169 kernel: audit: type=1103 audit(1707813769.387:2211): pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.569212 kernel: audit: type=1006 audit(1707813769.387:2212): pid=7626 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=99 res=1
Feb 13 08:42:49.387000 audit[7626]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6a80e640 a2=3 a3=0 items=0 ppid=1 pid=7626 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:42:49.627982 kernel: audit: type=1300 audit(1707813769.387:2212): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc6a80e640 a2=3 a3=0 items=0 ppid=1 pid=7626 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null)
Feb 13 08:42:49.387000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D
Feb 13 08:42:49.750420 kernel: audit: type=1327 audit(1707813769.387:2212): proctitle=737368643A20636F7265205B707269765D
Feb 13 08:42:49.750452 kernel: audit: type=1105 audit(1707813769.392:2213): pid=7626 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.392000 audit[7626]: USER_START pid=7626 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.392000 audit[7656]: CRED_ACQ pid=7656 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.934199 kernel: audit: type=1103 audit(1707813769.392:2214): pid=7656 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.934235 kernel: audit: type=1106 audit(1707813769.468:2215): pid=7626 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.468000 audit[7626]: USER_END pid=7626 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:50.029715 kernel: audit: type=1104 audit(1707813769.468:2216): pid=7626 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.468000 audit[7626]: CRED_DISP pid=7626 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Feb 13 08:42:49.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@123-145.40.67.89:22-139.178.68.195:49420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=?
res=success' Feb 13 08:42:54.264435 env[1475]: time="2024-02-13T08:42:54.264382535Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:42:54.277726 env[1475]: time="2024-02-13T08:42:54.277687860Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:54.277883 kubelet[2614]: E0213 08:42:54.277871 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:42:54.278079 kubelet[2614]: E0213 08:42:54.277899 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:42:54.278079 kubelet[2614]: E0213 08:42:54.277925 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:42:54.278079 kubelet[2614]: E0213 08:42:54.277951 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:42:54.478640 systemd[1]: Started sshd@124-145.40.67.89:22-139.178.68.195:49432.service. Feb 13 08:42:54.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@124-145.40.67.89:22-139.178.68.195:49432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:54.505407 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:54.505459 kernel: audit: type=1130 audit(1707813774.476:2218): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@124-145.40.67.89:22-139.178.68.195:49432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:54.623000 audit[7708]: USER_ACCT pid=7708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.624449 sshd[7708]: Accepted publickey for core from 139.178.68.195 port 49432 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:54.626472 sshd[7708]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:54.629230 systemd-logind[1463]: New session 100 of user core. Feb 13 08:42:54.630273 systemd[1]: Started session-100.scope. Feb 13 08:42:54.707919 sshd[7708]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:54.709853 systemd[1]: sshd@124-145.40.67.89:22-139.178.68.195:49432.service: Deactivated successfully. Feb 13 08:42:54.710283 systemd[1]: session-100.scope: Deactivated successfully. Feb 13 08:42:54.710691 systemd-logind[1463]: Session 100 logged out. Waiting for processes to exit. Feb 13 08:42:54.711255 systemd-logind[1463]: Removed session 100. 
Feb 13 08:42:54.625000 audit[7708]: CRED_ACQ pid=7708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.806177 kernel: audit: type=1101 audit(1707813774.623:2219): pid=7708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.806223 kernel: audit: type=1103 audit(1707813774.625:2220): pid=7708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.806240 kernel: audit: type=1006 audit(1707813774.625:2221): pid=7708 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=100 res=1 Feb 13 08:42:54.864863 kernel: audit: type=1300 audit(1707813774.625:2221): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3ae86ae0 a2=3 a3=0 items=0 ppid=1 pid=7708 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:54.625000 audit[7708]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe3ae86ae0 a2=3 a3=0 items=0 ppid=1 pid=7708 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:54.956960 kernel: audit: type=1327 audit(1707813774.625:2221): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:54.625000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:54.987464 kernel: audit: type=1105 audit(1707813774.630:2222): pid=7708 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.630000 audit[7708]: USER_START pid=7708 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:55.082072 kernel: audit: type=1103 audit(1707813774.631:2223): pid=7710 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.631000 audit[7710]: CRED_ACQ pid=7710 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:55.171392 kernel: audit: type=1106 audit(1707813774.707:2224): pid=7708 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.707000 audit[7708]: USER_END pid=7708 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:42:55.267008 kernel: audit: type=1104 audit(1707813774.707:2225): pid=7708 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.707000 audit[7708]: CRED_DISP pid=7708 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:54.708000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@124-145.40.67.89:22-139.178.68.195:49432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:56.264868 env[1475]: time="2024-02-13T08:42:56.264842510Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:42:56.277546 env[1475]: time="2024-02-13T08:42:56.277483132Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:42:56.277675 kubelet[2614]: E0213 08:42:56.277659 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:42:56.277867 kubelet[2614]: E0213 08:42:56.277684 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:42:56.277867 kubelet[2614]: E0213 08:42:56.277709 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:42:56.277867 kubelet[2614]: E0213 08:42:56.277728 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:42:59.717519 systemd[1]: Started sshd@125-145.40.67.89:22-139.178.68.195:44796.service. Feb 13 08:42:59.715000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@125-145.40.67.89:22-139.178.68.195:44796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:42:59.744620 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:42:59.744679 kernel: audit: type=1130 audit(1707813779.715:2227): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@125-145.40.67.89:22-139.178.68.195:44796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:42:59.862000 audit[7761]: USER_ACCT pid=7761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:59.863425 sshd[7761]: Accepted publickey for core from 139.178.68.195 port 44796 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:42:59.865214 sshd[7761]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:42:59.867575 systemd-logind[1463]: New session 101 of user core. Feb 13 08:42:59.868205 systemd[1]: Started session-101.scope. Feb 13 08:42:59.944967 sshd[7761]: pam_unix(sshd:session): session closed for user core Feb 13 08:42:59.946677 systemd[1]: sshd@125-145.40.67.89:22-139.178.68.195:44796.service: Deactivated successfully. Feb 13 08:42:59.947543 systemd[1]: session-101.scope: Deactivated successfully. Feb 13 08:42:59.948194 systemd-logind[1463]: Session 101 logged out. Waiting for processes to exit. Feb 13 08:42:59.948773 systemd-logind[1463]: Removed session 101. 
Feb 13 08:42:59.863000 audit[7761]: CRED_ACQ pid=7761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.045520 kernel: audit: type=1101 audit(1707813779.862:2228): pid=7761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.045560 kernel: audit: type=1103 audit(1707813779.863:2229): pid=7761 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.045580 kernel: audit: type=1006 audit(1707813779.863:2230): pid=7761 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=101 res=1 Feb 13 08:43:00.104222 kernel: audit: type=1300 audit(1707813779.863:2230): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc96180830 a2=3 a3=0 items=0 ppid=1 pid=7761 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:42:59.863000 audit[7761]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc96180830 a2=3 a3=0 items=0 ppid=1 pid=7761 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:00.196424 kernel: audit: type=1327 audit(1707813779.863:2230): proctitle=737368643A20636F7265205B707269765D Feb 13 08:42:59.863000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:42:59.868000 audit[7761]: USER_START pid=7761 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.265202 env[1475]: time="2024-02-13T08:43:00.265180685Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:43:00.265350 env[1475]: time="2024-02-13T08:43:00.265284714Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:43:00.277057 env[1475]: time="2024-02-13T08:43:00.276994386Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:00.277146 env[1475]: time="2024-02-13T08:43:00.277121688Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:00.277177 kubelet[2614]: E0213 08:43:00.277137 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:43:00.277177 kubelet[2614]: E0213 08:43:00.277162 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:43:00.277353 kubelet[2614]: E0213 08:43:00.277183 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:00.277353 kubelet[2614]: E0213 08:43:00.277194 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:43:00.277353 kubelet[2614]: E0213 08:43:00.277200 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:43:00.277353 kubelet[2614]: E0213 08:43:00.277209 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:43:00.277475 kubelet[2614]: E0213 08:43:00.277229 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:00.277475 kubelet[2614]: E0213 08:43:00.277242 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:43:00.321594 kernel: audit: type=1105 audit(1707813779.868:2231): pid=7761 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.321652 
kernel: audit: type=1103 audit(1707813779.869:2232): pid=7763 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:59.869000 audit[7763]: CRED_ACQ pid=7763 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.410949 kernel: audit: type=1106 audit(1707813779.943:2233): pid=7761 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:59.943000 audit[7761]: USER_END pid=7761 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:59.943000 audit[7761]: CRED_DISP pid=7761 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:00.596233 kernel: audit: type=1104 audit(1707813779.943:2234): pid=7761 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:42:59.945000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=sshd@125-145.40.67.89:22-139.178.68.195:44796 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:04.309410 systemd[1]: Started sshd@126-145.40.67.89:22-161.35.108.241:50628.service. Feb 13 08:43:04.308000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@126-145.40.67.89:22-161.35.108.241:50628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:04.768353 sshd[7846]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:43:04.767000 audit[7846]: USER_AUTH pid=7846 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:43:04.795391 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:43:04.795475 kernel: audit: type=1100 audit(1707813784.767:2237): pid=7846 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:43:04.948740 systemd[1]: Started sshd@127-145.40.67.89:22-139.178.68.195:44804.service. Feb 13 08:43:04.947000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@127-145.40.67.89:22-139.178.68.195:44804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:05.037994 kernel: audit: type=1130 audit(1707813784.947:2238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@127-145.40.67.89:22-139.178.68.195:44804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:05.065000 audit[7849]: USER_ACCT pid=7849 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.066844 sshd[7849]: Accepted publickey for core from 139.178.68.195 port 44804 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:05.068241 sshd[7849]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:05.070591 systemd-logind[1463]: New session 102 of user core. Feb 13 08:43:05.071091 systemd[1]: Started session-102.scope. Feb 13 08:43:05.148709 sshd[7849]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:05.150037 systemd[1]: sshd@127-145.40.67.89:22-139.178.68.195:44804.service: Deactivated successfully. Feb 13 08:43:05.150439 systemd[1]: session-102.scope: Deactivated successfully. Feb 13 08:43:05.150772 systemd-logind[1463]: Session 102 logged out. Waiting for processes to exit. Feb 13 08:43:05.151238 systemd-logind[1463]: Removed session 102. 
Feb 13 08:43:05.067000 audit[7849]: CRED_ACQ pid=7849 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.249340 kernel: audit: type=1101 audit(1707813785.065:2239): pid=7849 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.249383 kernel: audit: type=1103 audit(1707813785.067:2240): pid=7849 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.249407 kernel: audit: type=1006 audit(1707813785.067:2241): pid=7849 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=102 res=1 Feb 13 08:43:05.265296 env[1475]: time="2024-02-13T08:43:05.265272649Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:43:05.277046 env[1475]: time="2024-02-13T08:43:05.276979242Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:05.277144 kubelet[2614]: E0213 08:43:05.277134 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for 
sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:43:05.277306 kubelet[2614]: E0213 08:43:05.277160 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:43:05.277306 kubelet[2614]: E0213 08:43:05.277181 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:05.277306 kubelet[2614]: E0213 08:43:05.277199 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:43:05.067000 audit[7849]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd658ff6e0 a2=3 a3=0 items=0 ppid=1 pid=7849 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd" 
exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:05.400163 kernel: audit: type=1300 audit(1707813785.067:2241): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd658ff6e0 a2=3 a3=0 items=0 ppid=1 pid=7849 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:05.400208 kernel: audit: type=1327 audit(1707813785.067:2241): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:05.067000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:05.072000 audit[7849]: USER_START pid=7849 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.526230 kernel: audit: type=1105 audit(1707813785.072:2242): pid=7849 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.526303 kernel: audit: type=1103 audit(1707813785.072:2243): pid=7851 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.072000 audit[7851]: CRED_ACQ pid=7851 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.615644 kernel: audit: type=1106 audit(1707813785.148:2244): pid=7849 uid=0 
auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.148000 audit[7849]: USER_END pid=7849 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.148000 audit[7849]: CRED_DISP pid=7849 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:05.149000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@127-145.40.67.89:22-139.178.68.195:44804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:06.212805 sshd[7846]: Failed password for root from 161.35.108.241 port 50628 ssh2 Feb 13 08:43:07.647578 systemd[1]: Started sshd@128-145.40.67.89:22-43.153.15.221:42038.service. Feb 13 08:43:07.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@128-145.40.67.89:22-43.153.15.221:42038 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:07.690091 sshd[7846]: Received disconnect from 161.35.108.241 port 50628:11: Bye Bye [preauth] Feb 13 08:43:07.690091 sshd[7846]: Disconnected from authenticating user root 161.35.108.241 port 50628 [preauth] Feb 13 08:43:07.692545 systemd[1]: sshd@126-145.40.67.89:22-161.35.108.241:50628.service: Deactivated successfully. 
Feb 13 08:43:07.692000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@126-145.40.67.89:22-161.35.108.241:50628 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:07.814383 sshd[7902]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:43:07.813000 audit[7902]: USER_AUTH pid=7902 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:43:09.264668 env[1475]: time="2024-02-13T08:43:09.264598488Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:43:09.278308 env[1475]: time="2024-02-13T08:43:09.278231382Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:09.278495 kubelet[2614]: E0213 08:43:09.278451 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:43:09.278671 kubelet[2614]: E0213 08:43:09.278526 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:43:09.278671 kubelet[2614]: E0213 08:43:09.278546 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:09.278671 kubelet[2614]: E0213 08:43:09.278563 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:43:09.670144 sshd[7902]: Failed password for root from 43.153.15.221 port 42038 ssh2 Feb 13 08:43:10.159105 systemd[1]: Started sshd@129-145.40.67.89:22-139.178.68.195:44212.service. Feb 13 08:43:10.158000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@129-145.40.67.89:22-139.178.68.195:44212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:10.186125 kernel: kauditd_printk_skb: 5 callbacks suppressed Feb 13 08:43:10.186184 kernel: audit: type=1130 audit(1707813790.158:2250): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@129-145.40.67.89:22-139.178.68.195:44212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:10.305000 audit[7932]: USER_ACCT pid=7932 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.306288 sshd[7932]: Accepted publickey for core from 139.178.68.195 port 44212 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:10.308218 sshd[7932]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:10.310582 systemd-logind[1463]: New session 103 of user core. Feb 13 08:43:10.311115 systemd[1]: Started session-103.scope. Feb 13 08:43:10.393322 sshd[7932]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:10.394689 systemd[1]: sshd@129-145.40.67.89:22-139.178.68.195:44212.service: Deactivated successfully. Feb 13 08:43:10.395111 systemd[1]: session-103.scope: Deactivated successfully. Feb 13 08:43:10.395474 systemd-logind[1463]: Session 103 logged out. Waiting for processes to exit. Feb 13 08:43:10.395877 systemd-logind[1463]: Removed session 103. 
Feb 13 08:43:10.307000 audit[7932]: CRED_ACQ pid=7932 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.489243 kernel: audit: type=1101 audit(1707813790.305:2251): pid=7932 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.489282 kernel: audit: type=1103 audit(1707813790.307:2252): pid=7932 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.489300 kernel: audit: type=1006 audit(1707813790.307:2253): pid=7932 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=103 res=1 Feb 13 08:43:10.548248 kernel: audit: type=1300 audit(1707813790.307:2253): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4b0be4f0 a2=3 a3=0 items=0 ppid=1 pid=7932 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:10.307000 audit[7932]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4b0be4f0 a2=3 a3=0 items=0 ppid=1 pid=7932 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:10.640880 kernel: audit: type=1327 audit(1707813790.307:2253): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:10.307000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:43:10.669594 sshd[7902]: Received disconnect from 43.153.15.221 port 42038:11: Bye Bye [preauth] Feb 13 08:43:10.669594 sshd[7902]: Disconnected from authenticating user root 43.153.15.221 port 42038 [preauth] Feb 13 08:43:10.670275 systemd[1]: sshd@128-145.40.67.89:22-43.153.15.221:42038.service: Deactivated successfully. Feb 13 08:43:10.671580 kernel: audit: type=1105 audit(1707813790.312:2254): pid=7932 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.312000 audit[7932]: USER_START pid=7932 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.312000 audit[7934]: CRED_ACQ pid=7934 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.856486 kernel: audit: type=1103 audit(1707813790.312:2255): pid=7934 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.856525 kernel: audit: type=1106 audit(1707813790.393:2256): pid=7932 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Feb 13 08:43:10.393000 audit[7932]: USER_END pid=7932 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.952643 kernel: audit: type=1104 audit(1707813790.393:2257): pid=7932 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.393000 audit[7932]: CRED_DISP pid=7932 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:10.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@129-145.40.67.89:22-139.178.68.195:44212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:10.669000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@128-145.40.67.89:22-43.153.15.221:42038 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:11.264978 env[1475]: time="2024-02-13T08:43:11.264941112Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:43:11.277834 env[1475]: time="2024-02-13T08:43:11.277748451Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:11.277943 kubelet[2614]: E0213 08:43:11.277889 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:43:11.277943 kubelet[2614]: E0213 08:43:11.277916 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:43:11.278156 kubelet[2614]: E0213 08:43:11.277947 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:43:11.278156 kubelet[2614]: E0213 08:43:11.277968 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:43:15.265245 env[1475]: time="2024-02-13T08:43:15.265220188Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:43:15.278312 env[1475]: time="2024-02-13T08:43:15.278243284Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:15.278470 kubelet[2614]: E0213 08:43:15.278428 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:43:15.278470 kubelet[2614]: E0213 08:43:15.278458 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:43:15.278683 kubelet[2614]: E0213 08:43:15.278484 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:15.278683 kubelet[2614]: E0213 08:43:15.278506 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:43:15.403126 systemd[1]: Started sshd@130-145.40.67.89:22-139.178.68.195:44224.service. Feb 13 08:43:15.402000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@130-145.40.67.89:22-139.178.68.195:44224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:15.430375 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:43:15.430426 kernel: audit: type=1130 audit(1707813795.402:2260): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@130-145.40.67.89:22-139.178.68.195:44224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:15.548000 audit[8011]: USER_ACCT pid=8011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.549441 sshd[8011]: Accepted publickey for core from 139.178.68.195 port 44224 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:15.550209 sshd[8011]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:15.552366 systemd-logind[1463]: New session 104 of user core. Feb 13 08:43:15.552769 systemd[1]: Started session-104.scope. Feb 13 08:43:15.630138 sshd[8011]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:15.631442 systemd[1]: sshd@130-145.40.67.89:22-139.178.68.195:44224.service: Deactivated successfully. Feb 13 08:43:15.631845 systemd[1]: session-104.scope: Deactivated successfully. Feb 13 08:43:15.632261 systemd-logind[1463]: Session 104 logged out. Waiting for processes to exit. Feb 13 08:43:15.632748 systemd-logind[1463]: Removed session 104. 
Feb 13 08:43:15.549000 audit[8011]: CRED_ACQ pid=8011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.731598 kernel: audit: type=1101 audit(1707813795.548:2261): pid=8011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.731627 kernel: audit: type=1103 audit(1707813795.549:2262): pid=8011 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.731640 kernel: audit: type=1006 audit(1707813795.549:2263): pid=8011 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=104 res=1 Feb 13 08:43:15.790761 kernel: audit: type=1300 audit(1707813795.549:2263): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe30688f10 a2=3 a3=0 items=0 ppid=1 pid=8011 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:15.549000 audit[8011]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe30688f10 a2=3 a3=0 items=0 ppid=1 pid=8011 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:15.549000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:15.914112 kernel: audit: type=1327 audit(1707813795.549:2263): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:43:15.914145 kernel: audit: type=1105 audit(1707813795.554:2264): pid=8011 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.554000 audit[8011]: USER_START pid=8011 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:16.009276 kernel: audit: type=1103 audit(1707813795.554:2265): pid=8013 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.554000 audit[8013]: CRED_ACQ pid=8013 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.629000 audit[8011]: USER_END pid=8011 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:16.099029 kernel: audit: type=1106 audit(1707813795.629:2266): pid=8011 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:43:15.629000 audit[8011]: CRED_DISP pid=8011 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:16.283870 kernel: audit: type=1104 audit(1707813795.629:2267): pid=8011 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:15.630000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@130-145.40.67.89:22-139.178.68.195:44224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:19.265471 env[1475]: time="2024-02-13T08:43:19.265411906Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:43:19.279158 env[1475]: time="2024-02-13T08:43:19.279090019Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:19.279298 kubelet[2614]: E0213 08:43:19.279284 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:43:19.279522 kubelet[2614]: E0213 08:43:19.279314 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:43:19.279522 kubelet[2614]: E0213 08:43:19.279342 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:19.279522 kubelet[2614]: E0213 08:43:19.279366 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:43:20.265412 env[1475]: time="2024-02-13T08:43:20.265359636Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:43:20.279841 env[1475]: time="2024-02-13T08:43:20.279801864Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:20.280112 kubelet[2614]: E0213 08:43:20.279998 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:43:20.280112 kubelet[2614]: E0213 08:43:20.280028 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:43:20.280112 kubelet[2614]: E0213 08:43:20.280058 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:20.280112 kubelet[2614]: E0213 08:43:20.280081 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:43:20.641505 systemd[1]: Started sshd@131-145.40.67.89:22-139.178.68.195:46978.service. Feb 13 08:43:20.640000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@131-145.40.67.89:22-139.178.68.195:46978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:20.757356 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:43:20.757405 kernel: audit: type=1130 audit(1707813800.640:2269): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@131-145.40.67.89:22-139.178.68.195:46978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:20.786222 sshd[8089]: Accepted publickey for core from 139.178.68.195 port 46978 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:20.785000 audit[8089]: USER_ACCT pid=8089 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.788211 sshd[8089]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:20.790480 systemd-logind[1463]: New session 105 of user core. Feb 13 08:43:20.790970 systemd[1]: Started session-105.scope. Feb 13 08:43:20.871840 sshd[8089]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:20.873346 systemd[1]: sshd@131-145.40.67.89:22-139.178.68.195:46978.service: Deactivated successfully. Feb 13 08:43:20.873767 systemd[1]: session-105.scope: Deactivated successfully. Feb 13 08:43:20.874151 systemd-logind[1463]: Session 105 logged out. Waiting for processes to exit. 
Feb 13 08:43:20.874728 systemd-logind[1463]: Removed session 105. Feb 13 08:43:20.787000 audit[8089]: CRED_ACQ pid=8089 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.877935 kernel: audit: type=1101 audit(1707813800.785:2270): pid=8089 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.877962 kernel: audit: type=1103 audit(1707813800.787:2271): pid=8089 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:21.027040 kernel: audit: type=1006 audit(1707813800.787:2272): pid=8089 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=105 res=1 Feb 13 08:43:21.027076 kernel: audit: type=1300 audit(1707813800.787:2272): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca4738f90 a2=3 a3=0 items=0 ppid=1 pid=8089 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:20.787000 audit[8089]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca4738f90 a2=3 a3=0 items=0 ppid=1 pid=8089 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:21.119201 kernel: audit: type=1327 audit(1707813800.787:2272): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:20.787000 
audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:21.149719 kernel: audit: type=1105 audit(1707813800.792:2273): pid=8089 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.792000 audit[8089]: USER_START pid=8089 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.792000 audit[8091]: CRED_ACQ pid=8091 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:21.333648 kernel: audit: type=1103 audit(1707813800.792:2274): pid=8091 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:21.333682 kernel: audit: type=1106 audit(1707813800.871:2275): pid=8089 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.871000 audit[8089]: USER_END pid=8089 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" 
exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:21.429312 kernel: audit: type=1104 audit(1707813800.871:2276): pid=8089 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.871000 audit[8089]: CRED_DISP pid=8089 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:20.872000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@131-145.40.67.89:22-139.178.68.195:46978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:25.265121 env[1475]: time="2024-02-13T08:43:25.265053390Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:43:25.279299 env[1475]: time="2024-02-13T08:43:25.279259282Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:25.279443 kubelet[2614]: E0213 08:43:25.279428 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:43:25.279686 kubelet[2614]: E0213 08:43:25.279462 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:43:25.279686 kubelet[2614]: E0213 08:43:25.279504 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:25.279686 kubelet[2614]: E0213 08:43:25.279539 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:43:25.882449 systemd[1]: Started sshd@132-145.40.67.89:22-139.178.68.195:46980.service. Feb 13 08:43:25.881000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@132-145.40.67.89:22-139.178.68.195:46980 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:25.910161 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:43:25.910211 kernel: audit: type=1130 audit(1707813805.881:2278): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@132-145.40.67.89:22-139.178.68.195:46980 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:26.027000 audit[8145]: USER_ACCT pid=8145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.028384 sshd[8145]: Accepted publickey for core from 139.178.68.195 port 46980 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:26.029221 sshd[8145]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:26.031526 systemd-logind[1463]: New session 106 of user core. Feb 13 08:43:26.031968 systemd[1]: Started session-106.scope. Feb 13 08:43:26.110129 sshd[8145]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:26.111559 systemd[1]: sshd@132-145.40.67.89:22-139.178.68.195:46980.service: Deactivated successfully. Feb 13 08:43:26.111982 systemd[1]: session-106.scope: Deactivated successfully. Feb 13 08:43:26.112375 systemd-logind[1463]: Session 106 logged out. Waiting for processes to exit. Feb 13 08:43:26.112834 systemd-logind[1463]: Removed session 106. 
Feb 13 08:43:26.028000 audit[8145]: CRED_ACQ pid=8145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.211772 kernel: audit: type=1101 audit(1707813806.027:2279): pid=8145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.211812 kernel: audit: type=1103 audit(1707813806.028:2280): pid=8145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.211831 kernel: audit: type=1006 audit(1707813806.028:2281): pid=8145 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=106 res=1 Feb 13 08:43:26.270568 kernel: audit: type=1300 audit(1707813806.028:2281): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd511495a0 a2=3 a3=0 items=0 ppid=1 pid=8145 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:26.028000 audit[8145]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd511495a0 a2=3 a3=0 items=0 ppid=1 pid=8145 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:26.362671 kernel: audit: type=1327 audit(1707813806.028:2281): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:26.028000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:43:26.393205 kernel: audit: type=1105 audit(1707813806.033:2282): pid=8145 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.033000 audit[8145]: USER_START pid=8145 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.487837 kernel: audit: type=1103 audit(1707813806.033:2283): pid=8147 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.033000 audit[8147]: CRED_ACQ pid=8147 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.577271 kernel: audit: type=1106 audit(1707813806.109:2284): pid=8145 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.109000 audit[8145]: USER_END pid=8145 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:43:26.672936 kernel: audit: type=1104 audit(1707813806.109:2285): pid=8145 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.109000 audit[8145]: CRED_DISP pid=8145 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:26.110000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@132-145.40.67.89:22-139.178.68.195:46980 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:29.265251 env[1475]: time="2024-02-13T08:43:29.265227963Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:43:29.277757 env[1475]: time="2024-02-13T08:43:29.277726527Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:29.277851 kubelet[2614]: E0213 08:43:29.277829 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:43:29.277851 kubelet[2614]: E0213 08:43:29.277850 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:43:29.278052 kubelet[2614]: E0213 08:43:29.277877 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:29.278052 kubelet[2614]: E0213 08:43:29.277896 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:43:31.120204 systemd[1]: Started sshd@133-145.40.67.89:22-139.178.68.195:60916.service. Feb 13 08:43:31.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@133-145.40.67.89:22-139.178.68.195:60916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:31.147651 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:43:31.147700 kernel: audit: type=1130 audit(1707813811.119:2287): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@133-145.40.67.89:22-139.178.68.195:60916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:31.265220 env[1475]: time="2024-02-13T08:43:31.265146652Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:43:31.264000 audit[8197]: USER_ACCT pid=8197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.265666 sshd[8197]: Accepted publickey for core from 139.178.68.195 port 60916 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:31.267577 sshd[8197]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:31.272404 systemd-logind[1463]: New session 107 of user core. Feb 13 08:43:31.273525 systemd[1]: Started session-107.scope. 
Feb 13 08:43:31.282624 env[1475]: time="2024-02-13T08:43:31.282592765Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:31.282752 kubelet[2614]: E0213 08:43:31.282741 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:43:31.282907 kubelet[2614]: E0213 08:43:31.282766 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:43:31.282907 kubelet[2614]: E0213 08:43:31.282786 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:31.282907 kubelet[2614]: E0213 08:43:31.282806 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:43:31.352895 sshd[8197]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:31.354450 systemd[1]: sshd@133-145.40.67.89:22-139.178.68.195:60916.service: Deactivated successfully. Feb 13 08:43:31.354956 systemd[1]: session-107.scope: Deactivated successfully. Feb 13 08:43:31.355377 systemd-logind[1463]: Session 107 logged out. Waiting for processes to exit. Feb 13 08:43:31.355840 systemd-logind[1463]: Removed session 107. Feb 13 08:43:31.266000 audit[8197]: CRED_ACQ pid=8197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.447624 kernel: audit: type=1101 audit(1707813811.264:2288): pid=8197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.447661 kernel: audit: type=1103 audit(1707813811.266:2289): pid=8197 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.447677 kernel: audit: type=1006 audit(1707813811.266:2290): pid=8197 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 
auid=500 tty=(none) old-ses=4294967295 ses=107 res=1 Feb 13 08:43:31.506372 kernel: audit: type=1300 audit(1707813811.266:2290): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5a764400 a2=3 a3=0 items=0 ppid=1 pid=8197 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:31.266000 audit[8197]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5a764400 a2=3 a3=0 items=0 ppid=1 pid=8197 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:31.598672 kernel: audit: type=1327 audit(1707813811.266:2290): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:31.266000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:31.629185 kernel: audit: type=1105 audit(1707813811.276:2291): pid=8197 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.276000 audit[8197]: USER_START pid=8197 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.723815 kernel: audit: type=1103 audit(1707813811.276:2292): pid=8220 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.276000 audit[8220]: CRED_ACQ pid=8220 uid=0 auid=500 ses=107 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.813178 kernel: audit: type=1106 audit(1707813811.352:2293): pid=8197 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.352000 audit[8197]: USER_END pid=8197 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.908861 kernel: audit: type=1104 audit(1707813811.352:2294): pid=8197 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.352000 audit[8197]: CRED_DISP pid=8197 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:31.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@133-145.40.67.89:22-139.178.68.195:60916 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:33.266358 env[1475]: time="2024-02-13T08:43:33.266266377Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:43:33.315047 env[1475]: time="2024-02-13T08:43:33.314959761Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:33.315217 kubelet[2614]: E0213 08:43:33.315186 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:43:33.315511 kubelet[2614]: E0213 08:43:33.315230 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:43:33.315511 kubelet[2614]: E0213 08:43:33.315271 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:43:33.315511 kubelet[2614]: E0213 08:43:33.315307 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:43:35.613000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.613000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0007e6840 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:43:35.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:43:35.613000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.613000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c0026c6380 a2=fc6 a3=0 items=0 
ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:43:35.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:43:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0034a8b00 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:43:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:43:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00370c5a0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:43:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:43:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0106abb30 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:43:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:43:35.896000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.896000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c008c1f1a0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" 
exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:43:35.896000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:43:35.897000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.897000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:35.897000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c011a86600 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:43:35.897000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:43:35.897000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c006c40900 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 
key=(null) Feb 13 08:43:35.897000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:43:36.365969 systemd[1]: Started sshd@134-145.40.67.89:22-139.178.68.195:33940.service. Feb 13 08:43:36.365000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@134-145.40.67.89:22-139.178.68.195:33940 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:36.370879 systemd[1]: Starting systemd-tmpfiles-clean.service... Feb 13 08:43:36.405868 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:43:36.406010 kernel: audit: type=1130 audit(1707813816.365:2304): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@134-145.40.67.89:22-139.178.68.195:33940 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:36.498771 systemd-tmpfiles[8284]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Feb 13 08:43:36.499000 systemd-tmpfiles[8284]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 08:43:36.499686 systemd-tmpfiles[8284]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 08:43:36.510133 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. Feb 13 08:43:36.510229 systemd[1]: Finished systemd-tmpfiles-clean.service. Feb 13 08:43:36.509000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:36.522318 sshd[8283]: Accepted publickey for core from 139.178.68.195 port 33940 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:36.524003 sshd[8283]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:36.509000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:36.597604 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully. Feb 13 08:43:36.600156 systemd-logind[1463]: New session 108 of user core. Feb 13 08:43:36.600660 systemd[1]: Started session-108.scope. Feb 13 08:43:36.676026 sshd[8283]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:36.677492 systemd[1]: sshd@134-145.40.67.89:22-139.178.68.195:33940.service: Deactivated successfully. Feb 13 08:43:36.677921 systemd[1]: session-108.scope: Deactivated successfully. Feb 13 08:43:36.678344 systemd-logind[1463]: Session 108 logged out. Waiting for processes to exit. Feb 13 08:43:36.678842 systemd-logind[1463]: Removed session 108. Feb 13 08:43:36.683126 kernel: audit: type=1130 audit(1707813816.509:2305): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:36.683161 kernel: audit: type=1131 audit(1707813816.509:2306): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:36.683177 kernel: audit: type=1101 audit(1707813816.521:2307): pid=8283 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.521000 audit[8283]: USER_ACCT pid=8283 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.775279 kernel: audit: type=1103 audit(1707813816.522:2308): pid=8283 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.522000 audit[8283]: CRED_ACQ pid=8283 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.924574 kernel: audit: type=1006 audit(1707813816.522:2309): pid=8283 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=108 res=1 Feb 13 08:43:36.924650 kernel: audit: type=1300 audit(1707813816.522:2309): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe95ad30e0 a2=3 a3=0 items=0 ppid=1 pid=8283 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:36.522000 audit[8283]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe95ad30e0 a2=3 a3=0 items=0 ppid=1 pid=8283 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=108 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:37.017289 kernel: audit: type=1327 audit(1707813816.522:2309): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:36.522000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:37.048125 kernel: audit: type=1105 audit(1707813816.601:2310): pid=8283 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.601000 audit[8283]: USER_START pid=8283 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:37.143695 kernel: audit: type=1103 audit(1707813816.602:2311): pid=8287 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.602000 audit[8287]: CRED_ACQ pid=8287 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.675000 audit[8283]: USER_END pid=8283 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.675000 audit[8283]: CRED_DISP 
pid=8283 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:36.676000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@134-145.40.67.89:22-139.178.68.195:33940 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:40.265567 env[1475]: time="2024-02-13T08:43:40.265539874Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:43:40.265567 env[1475]: time="2024-02-13T08:43:40.265548944Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:43:40.280579 env[1475]: time="2024-02-13T08:43:40.280512858Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:40.280728 kubelet[2614]: E0213 08:43:40.280713 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:43:40.280975 kubelet[2614]: E0213 08:43:40.280744 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" 
podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:43:40.280975 kubelet[2614]: E0213 08:43:40.280775 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:40.280975 kubelet[2614]: E0213 08:43:40.280798 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:43:40.280975 kubelet[2614]: E0213 08:43:40.280877 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:43:40.280975 kubelet[2614]: E0213 08:43:40.280888 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:43:40.281185 env[1475]: time="2024-02-13T08:43:40.280788729Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:40.281223 kubelet[2614]: E0213 08:43:40.280910 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:40.281223 kubelet[2614]: E0213 08:43:40.280933 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:43:40.954000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 
tclass=file permissive=0 Feb 13 08:43:40.954000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c0026c7280 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:43:40.954000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:43:40.954000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:40.954000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c00267e7c0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:43:40.954000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:43:40.954000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:40.954000 audit[2442]: SYSCALL arch=c000003e 
syscall=254 success=no exit=-13 a0=b a1=c0026c72a0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:43:40.954000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:43:40.954000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:43:40.954000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c002684e20 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:43:40.954000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:43:41.685836 systemd[1]: Started sshd@135-145.40.67.89:22-139.178.68.195:33948.service. Feb 13 08:43:41.684000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@135-145.40.67.89:22-139.178.68.195:33948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:41.713059 kernel: kauditd_printk_skb: 15 callbacks suppressed Feb 13 08:43:41.713157 kernel: audit: type=1130 audit(1707813821.684:2319): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@135-145.40.67.89:22-139.178.68.195:33948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:41.830000 audit[8366]: USER_ACCT pid=8366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:41.831814 sshd[8366]: Accepted publickey for core from 139.178.68.195 port 33948 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:41.833078 sshd[8366]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:41.835469 systemd-logind[1463]: New session 109 of user core. Feb 13 08:43:41.835936 systemd[1]: Started session-109.scope. Feb 13 08:43:41.920317 sshd[8366]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:41.921681 systemd[1]: sshd@135-145.40.67.89:22-139.178.68.195:33948.service: Deactivated successfully. Feb 13 08:43:41.922159 systemd[1]: session-109.scope: Deactivated successfully. Feb 13 08:43:41.922526 systemd-logind[1463]: Session 109 logged out. Waiting for processes to exit. Feb 13 08:43:41.922910 systemd-logind[1463]: Removed session 109. 
Feb 13 08:43:41.831000 audit[8366]: CRED_ACQ pid=8366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:42.015491 kernel: audit: type=1101 audit(1707813821.830:2320): pid=8366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:42.015531 kernel: audit: type=1103 audit(1707813821.831:2321): pid=8366 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:41.831000 audit[8366]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffb4747ac0 a2=3 a3=0 items=0 ppid=1 pid=8366 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:42.166890 kernel: audit: type=1006 audit(1707813821.831:2322): pid=8366 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=109 res=1 Feb 13 08:43:42.166924 kernel: audit: type=1300 audit(1707813821.831:2322): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffb4747ac0 a2=3 a3=0 items=0 ppid=1 pid=8366 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:42.166943 kernel: audit: type=1327 audit(1707813821.831:2322): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:41.831000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:43:42.197408 kernel: audit: type=1105 audit(1707813821.837:2323): pid=8366 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:41.837000 audit[8366]: USER_START pid=8366 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:42.265289 env[1475]: time="2024-02-13T08:43:42.265215815Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:43:42.277177 env[1475]: time="2024-02-13T08:43:42.277115994Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:42.277271 kubelet[2614]: E0213 08:43:42.277253 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:43:42.277424 kubelet[2614]: E0213 08:43:42.277278 2614 kuberuntime_manager.go:965] 
"Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:43:42.277424 kubelet[2614]: E0213 08:43:42.277299 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:42.277424 kubelet[2614]: E0213 08:43:42.277318 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:43:41.837000 audit[8368]: CRED_ACQ pid=8368 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:42.293025 kernel: audit: type=1103 audit(1707813821.837:2324): pid=8368 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:41.920000 audit[8366]: USER_END pid=8366 uid=0 auid=500 ses=109 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:42.381994 kernel: audit: type=1106 audit(1707813821.920:2325): pid=8366 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:41.920000 audit[8366]: CRED_DISP pid=8366 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:42.566861 kernel: audit: type=1104 audit(1707813821.920:2326): pid=8366 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:41.920000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@135-145.40.67.89:22-139.178.68.195:33948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:46.929535 systemd[1]: Started sshd@136-145.40.67.89:22-139.178.68.195:40324.service. Feb 13 08:43:46.928000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@136-145.40.67.89:22-139.178.68.195:40324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:46.956501 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:43:46.956553 kernel: audit: type=1130 audit(1707813826.928:2328): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@136-145.40.67.89:22-139.178.68.195:40324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:47.074000 audit[8417]: USER_ACCT pid=8417 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.076030 sshd[8417]: Accepted publickey for core from 139.178.68.195 port 40324 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:47.079364 sshd[8417]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:47.084525 systemd-logind[1463]: New session 110 of user core. Feb 13 08:43:47.085076 systemd[1]: Started session-110.scope. Feb 13 08:43:47.163576 sshd[8417]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:47.165073 systemd[1]: sshd@136-145.40.67.89:22-139.178.68.195:40324.service: Deactivated successfully. Feb 13 08:43:47.165486 systemd[1]: session-110.scope: Deactivated successfully. Feb 13 08:43:47.165869 systemd-logind[1463]: Session 110 logged out. Waiting for processes to exit. Feb 13 08:43:47.166740 systemd-logind[1463]: Removed session 110. 
Feb 13 08:43:47.077000 audit[8417]: CRED_ACQ pid=8417 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.258982 kernel: audit: type=1101 audit(1707813827.074:2329): pid=8417 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.259020 kernel: audit: type=1103 audit(1707813827.077:2330): pid=8417 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.259040 kernel: audit: type=1006 audit(1707813827.077:2331): pid=8417 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=110 res=1 Feb 13 08:43:47.264434 env[1475]: time="2024-02-13T08:43:47.264391555Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:43:47.276509 env[1475]: time="2024-02-13T08:43:47.276442562Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:47.276633 kubelet[2614]: E0213 08:43:47.276600 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for 
sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:43:47.276633 kubelet[2614]: E0213 08:43:47.276624 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:43:47.276809 kubelet[2614]: E0213 08:43:47.276646 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:47.276809 kubelet[2614]: E0213 08:43:47.276664 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:43:47.317285 kernel: audit: type=1300 audit(1707813827.077:2331): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4527b550 a2=3 a3=0 items=0 ppid=1 pid=8417 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=110 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:47.077000 audit[8417]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff4527b550 a2=3 a3=0 items=0 ppid=1 pid=8417 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:47.409322 kernel: audit: type=1327 audit(1707813827.077:2331): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:47.077000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:47.086000 audit[8417]: USER_START pid=8417 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.534540 kernel: audit: type=1105 audit(1707813827.086:2332): pid=8417 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.534574 kernel: audit: type=1103 audit(1707813827.087:2333): pid=8419 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.087000 audit[8419]: CRED_ACQ pid=8419 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.623949 kernel: audit: type=1106 audit(1707813827.163:2334): pid=8417 uid=0 
auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.163000 audit[8417]: USER_END pid=8417 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.163000 audit[8417]: CRED_DISP pid=8417 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.809261 kernel: audit: type=1104 audit(1707813827.163:2335): pid=8417 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:47.164000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@136-145.40.67.89:22-139.178.68.195:40324 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:48.790609 systemd[1]: Started sshd@137-145.40.67.89:22-61.83.148.111:35204.service. Feb 13 08:43:48.789000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@137-145.40.67.89:22-61.83.148.111:35204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:49.578861 sshd[8469]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=61.83.148.111 user=root Feb 13 08:43:49.578000 audit[8469]: ANOM_LOGIN_FAILURES pid=8469 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='pam_faillock uid=0 exe="/usr/sbin/sshd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:49.578000 audit[8469]: USER_AUTH pid=8469 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=61.83.148.111 addr=61.83.148.111 terminal=ssh res=failed' Feb 13 08:43:49.579118 sshd[8469]: pam_faillock(sshd:auth): Consecutive login failures for user root account temporarily locked Feb 13 08:43:51.535131 sshd[8469]: Failed password for root from 61.83.148.111 port 35204 ssh2 Feb 13 08:43:52.172329 systemd[1]: Started sshd@138-145.40.67.89:22-139.178.68.195:40338.service. Feb 13 08:43:52.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@138-145.40.67.89:22-139.178.68.195:40338 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:52.199432 kernel: kauditd_printk_skb: 4 callbacks suppressed Feb 13 08:43:52.199524 kernel: audit: type=1130 audit(1707813832.171:2340): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@138-145.40.67.89:22-139.178.68.195:40338 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:52.227962 sshd[8472]: Accepted publickey for core from 139.178.68.195 port 40338 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:52.229251 sshd[8472]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:52.231859 systemd-logind[1463]: New session 111 of user core. 
Feb 13 08:43:52.232527 systemd[1]: Started session-111.scope. Feb 13 08:43:52.265560 env[1475]: time="2024-02-13T08:43:52.265536377Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:43:52.283735 env[1475]: time="2024-02-13T08:43:52.283675675Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:52.283889 kubelet[2614]: E0213 08:43:52.283876 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:43:52.284304 kubelet[2614]: E0213 08:43:52.283907 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:43:52.284304 kubelet[2614]: E0213 08:43:52.283964 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" Feb 13 08:43:52.284304 kubelet[2614]: E0213 08:43:52.283986 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:43:52.227000 audit[8472]: USER_ACCT pid=8472 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.312424 sshd[8472]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:52.313892 systemd[1]: sshd@138-145.40.67.89:22-139.178.68.195:40338.service: Deactivated successfully. Feb 13 08:43:52.314310 systemd[1]: session-111.scope: Deactivated successfully. Feb 13 08:43:52.314694 systemd-logind[1463]: Session 111 logged out. Waiting for processes to exit. Feb 13 08:43:52.315418 systemd-logind[1463]: Removed session 111. 
Feb 13 08:43:52.381708 kernel: audit: type=1101 audit(1707813832.227:2341): pid=8472 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.381756 kernel: audit: type=1103 audit(1707813832.228:2342): pid=8472 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.228000 audit[8472]: CRED_ACQ pid=8472 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.472316 kernel: audit: type=1006 audit(1707813832.228:2343): pid=8472 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=111 res=1 Feb 13 08:43:52.228000 audit[8472]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc3ebfdb0 a2=3 a3=0 items=0 ppid=1 pid=8472 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:52.571069 sshd[8469]: Received disconnect from 61.83.148.111 port 35204:11: Bye Bye [preauth] Feb 13 08:43:52.571069 sshd[8469]: Disconnected from authenticating user root 61.83.148.111 port 35204 [preauth] Feb 13 08:43:52.571657 systemd[1]: sshd@137-145.40.67.89:22-61.83.148.111:35204.service: Deactivated successfully. 
Feb 13 08:43:52.623150 kernel: audit: type=1300 audit(1707813832.228:2343): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffc3ebfdb0 a2=3 a3=0 items=0 ppid=1 pid=8472 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:52.623182 kernel: audit: type=1327 audit(1707813832.228:2343): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:52.228000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:52.234000 audit[8472]: USER_START pid=8472 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.748291 kernel: audit: type=1105 audit(1707813832.234:2344): pid=8472 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.748329 kernel: audit: type=1103 audit(1707813832.235:2345): pid=8474 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.235000 audit[8474]: CRED_ACQ pid=8474 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.837642 kernel: audit: type=1106 audit(1707813832.312:2346): pid=8472 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.312000 audit[8472]: USER_END pid=8472 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.933271 kernel: audit: type=1104 audit(1707813832.312:2347): pid=8472 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.312000 audit[8472]: CRED_DISP pid=8472 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:52.313000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@138-145.40.67.89:22-139.178.68.195:40338 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:52.570000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@137-145.40.67.89:22-61.83.148.111:35204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:43:53.265484 env[1475]: time="2024-02-13T08:43:53.265386988Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:43:53.290775 env[1475]: time="2024-02-13T08:43:53.290741208Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:53.291053 kubelet[2614]: E0213 08:43:53.290925 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:43:53.291053 kubelet[2614]: E0213 08:43:53.290978 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:43:53.291053 kubelet[2614]: E0213 08:43:53.291013 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:43:53.291053 kubelet[2614]: E0213 08:43:53.291029 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:43:54.264564 env[1475]: time="2024-02-13T08:43:54.264539413Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:43:54.277192 env[1475]: time="2024-02-13T08:43:54.277113822Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:54.277300 kubelet[2614]: E0213 08:43:54.277286 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:43:54.277353 kubelet[2614]: E0213 08:43:54.277315 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:43:54.277353 kubelet[2614]: E0213 08:43:54.277340 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:54.277433 kubelet[2614]: E0213 08:43:54.277359 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:43:57.322063 systemd[1]: Started sshd@139-145.40.67.89:22-139.178.68.195:46684.service. Feb 13 08:43:57.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@139-145.40.67.89:22-139.178.68.195:46684 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:57.348657 kernel: kauditd_printk_skb: 2 callbacks suppressed Feb 13 08:43:57.348718 kernel: audit: type=1130 audit(1707813837.321:2350): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@139-145.40.67.89:22-139.178.68.195:46684 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Feb 13 08:43:57.466000 audit[8583]: USER_ACCT pid=8583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.467378 sshd[8583]: Accepted publickey for core from 139.178.68.195 port 46684 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:43:57.468246 sshd[8583]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:43:57.470628 systemd-logind[1463]: New session 112 of user core. Feb 13 08:43:57.471080 systemd[1]: Started session-112.scope. Feb 13 08:43:57.548371 sshd[8583]: pam_unix(sshd:session): session closed for user core Feb 13 08:43:57.549725 systemd[1]: sshd@139-145.40.67.89:22-139.178.68.195:46684.service: Deactivated successfully. Feb 13 08:43:57.550159 systemd[1]: session-112.scope: Deactivated successfully. Feb 13 08:43:57.550533 systemd-logind[1463]: Session 112 logged out. Waiting for processes to exit. Feb 13 08:43:57.550957 systemd-logind[1463]: Removed session 112. 
Feb 13 08:43:57.467000 audit[8583]: CRED_ACQ pid=8583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.649378 kernel: audit: type=1101 audit(1707813837.466:2351): pid=8583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.649415 kernel: audit: type=1103 audit(1707813837.467:2352): pid=8583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.649434 kernel: audit: type=1006 audit(1707813837.467:2353): pid=8583 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=112 res=1 Feb 13 08:43:57.708118 kernel: audit: type=1300 audit(1707813837.467:2353): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffcb1c5bd0 a2=3 a3=0 items=0 ppid=1 pid=8583 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:57.467000 audit[8583]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffcb1c5bd0 a2=3 a3=0 items=0 ppid=1 pid=8583 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:43:57.800234 kernel: audit: type=1327 audit(1707813837.467:2353): proctitle=737368643A20636F7265205B707269765D Feb 13 08:43:57.467000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:43:57.830738 kernel: audit: type=1105 audit(1707813837.472:2354): pid=8583 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.472000 audit[8583]: USER_START pid=8583 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.925321 kernel: audit: type=1103 audit(1707813837.473:2355): pid=8585 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.473000 audit[8585]: CRED_ACQ pid=8585 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:58.014760 kernel: audit: type=1106 audit(1707813837.548:2356): pid=8583 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.548000 audit[8583]: USER_END pid=8583 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:43:58.110409 kernel: audit: type=1104 audit(1707813837.548:2357): pid=8583 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.548000 audit[8583]: CRED_DISP pid=8583 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:43:57.548000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@139-145.40.67.89:22-139.178.68.195:46684 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:43:58.264636 env[1475]: time="2024-02-13T08:43:58.264564924Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:43:58.278881 env[1475]: time="2024-02-13T08:43:58.278779151Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:43:58.279062 kubelet[2614]: E0213 08:43:58.278999 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:43:58.279062 kubelet[2614]: E0213 08:43:58.279054 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:43:58.279297 kubelet[2614]: E0213 08:43:58.279084 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:43:58.279297 kubelet[2614]: E0213 08:43:58.279106 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:44:02.509846 systemd[1]: Started sshd@140-145.40.67.89:22-161.35.108.241:58348.service. Feb 13 08:44:02.508000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@140-145.40.67.89:22-161.35.108.241:58348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:02.536840 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:44:02.536883 kernel: audit: type=1130 audit(1707813842.508:2359): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@140-145.40.67.89:22-161.35.108.241:58348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:02.627464 systemd[1]: Started sshd@141-145.40.67.89:22-139.178.68.195:46690.service. Feb 13 08:44:02.626000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@141-145.40.67.89:22-139.178.68.195:46690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:02.715968 kernel: audit: type=1130 audit(1707813842.626:2360): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@141-145.40.67.89:22-139.178.68.195:46690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:02.743000 audit[8641]: USER_ACCT pid=8641 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.744979 sshd[8641]: Accepted publickey for core from 139.178.68.195 port 46690 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:02.747940 sshd[8641]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:02.755617 systemd-logind[1463]: New session 113 of user core. Feb 13 08:44:02.757643 systemd[1]: Started session-113.scope. 
Feb 13 08:44:02.746000 audit[8641]: CRED_ACQ pid=8641 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.845729 sshd[8641]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:02.847238 systemd[1]: sshd@141-145.40.67.89:22-139.178.68.195:46690.service: Deactivated successfully. Feb 13 08:44:02.847702 systemd[1]: session-113.scope: Deactivated successfully. Feb 13 08:44:02.848143 systemd-logind[1463]: Session 113 logged out. Waiting for processes to exit. Feb 13 08:44:02.848760 systemd-logind[1463]: Removed session 113. Feb 13 08:44:02.929704 kernel: audit: type=1101 audit(1707813842.743:2361): pid=8641 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.929743 kernel: audit: type=1103 audit(1707813842.746:2362): pid=8641 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.929760 kernel: audit: type=1006 audit(1707813842.746:2363): pid=8641 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=113 res=1 Feb 13 08:44:02.988428 kernel: audit: type=1300 audit(1707813842.746:2363): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc1eb546e0 a2=3 a3=0 items=0 ppid=1 pid=8641 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:02.746000 audit[8641]: SYSCALL arch=c000003e syscall=1 
success=yes exit=3 a0=5 a1=7ffc1eb546e0 a2=3 a3=0 items=0 ppid=1 pid=8641 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:03.080542 kernel: audit: type=1327 audit(1707813842.746:2363): proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:02.746000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:02.766000 audit[8641]: USER_START pid=8641 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:03.111988 kernel: audit: type=1105 audit(1707813842.766:2364): pid=8641 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:03.150548 sshd[8638]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=161.35.108.241 user=root Feb 13 08:44:02.768000 audit[8643]: CRED_ACQ pid=8643 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:03.296372 kernel: audit: type=1103 audit(1707813842.768:2365): pid=8643 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:03.296448 kernel: audit: type=1106 audit(1707813842.845:2366): pid=8641 uid=0 auid=500 ses=113 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.845000 audit[8641]: USER_END pid=8641 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.845000 audit[8641]: CRED_DISP pid=8641 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:02.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@141-145.40.67.89:22-139.178.68.195:46690 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:03.149000 audit[8638]: USER_AUTH pid=8638 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=161.35.108.241 addr=161.35.108.241 terminal=ssh res=failed' Feb 13 08:44:03.435755 systemd[1]: Started sshd@142-145.40.67.89:22-43.153.15.221:60872.service. Feb 13 08:44:03.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@142-145.40.67.89:22-43.153.15.221:60872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:03.568450 sshd[8666]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=43.153.15.221 user=root Feb 13 08:44:03.567000 audit[8666]: USER_AUTH pid=8666 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:authentication grantors=? acct="root" exe="/usr/sbin/sshd" hostname=43.153.15.221 addr=43.153.15.221 terminal=ssh res=failed' Feb 13 08:44:04.264917 env[1475]: time="2024-02-13T08:44:04.264891783Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:44:04.277820 env[1475]: time="2024-02-13T08:44:04.277784215Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:04.277997 kubelet[2614]: E0213 08:44:04.277944 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:44:04.277997 kubelet[2614]: E0213 08:44:04.277971 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:44:04.277997 kubelet[2614]: E0213 08:44:04.277996 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:04.278283 kubelet[2614]: E0213 08:44:04.278016 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:44:05.266464 env[1475]: time="2024-02-13T08:44:05.266340287Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:44:05.266464 env[1475]: time="2024-02-13T08:44:05.266348678Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:44:05.315034 env[1475]: time="2024-02-13T08:44:05.314944934Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:05.315238 env[1475]: time="2024-02-13T08:44:05.315162157Z" level=error msg="StopPodSandbox for 
\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:05.315341 kubelet[2614]: E0213 08:44:05.315212 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:44:05.315341 kubelet[2614]: E0213 08:44:05.315256 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:44:05.315341 kubelet[2614]: E0213 08:44:05.315301 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:05.315341 kubelet[2614]: E0213 08:44:05.315339 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:44:05.315825 kubelet[2614]: E0213 08:44:05.315358 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:44:05.315825 kubelet[2614]: E0213 08:44:05.315391 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:44:05.315825 kubelet[2614]: E0213 08:44:05.315429 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:05.315825 kubelet[2614]: E0213 08:44:05.315459 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:44:05.362176 sshd[8638]: Failed password for root from 161.35.108.241 port 58348 ssh2 Feb 13 08:44:05.780565 sshd[8666]: Failed password for root from 43.153.15.221 port 60872 ssh2 Feb 13 08:44:06.078566 sshd[8638]: Received disconnect from 161.35.108.241 port 58348:11: Bye Bye [preauth] Feb 13 08:44:06.078566 sshd[8638]: Disconnected from authenticating user root 161.35.108.241 port 58348 [preauth] Feb 13 08:44:06.081205 systemd[1]: sshd@140-145.40.67.89:22-161.35.108.241:58348.service: Deactivated successfully. Feb 13 08:44:06.080000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@140-145.40.67.89:22-161.35.108.241:58348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:06.429491 sshd[8666]: Received disconnect from 43.153.15.221 port 60872:11: Bye Bye [preauth] Feb 13 08:44:06.429491 sshd[8666]: Disconnected from authenticating user root 43.153.15.221 port 60872 [preauth] Feb 13 08:44:06.432063 systemd[1]: sshd@142-145.40.67.89:22-43.153.15.221:60872.service: Deactivated successfully. Feb 13 08:44:06.431000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@142-145.40.67.89:22-43.153.15.221:60872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:07.855990 systemd[1]: Started sshd@143-145.40.67.89:22-139.178.68.195:56494.service. 
Feb 13 08:44:07.855000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@143-145.40.67.89:22-139.178.68.195:56494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:07.893717 kernel: kauditd_printk_skb: 7 callbacks suppressed Feb 13 08:44:07.893778 kernel: audit: type=1130 audit(1707813847.855:2374): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@143-145.40.67.89:22-139.178.68.195:56494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:08.009000 audit[8759]: USER_ACCT pid=8759 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.010742 sshd[8759]: Accepted publickey for core from 139.178.68.195 port 56494 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:08.011718 sshd[8759]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:08.014572 systemd-logind[1463]: New session 114 of user core. Feb 13 08:44:08.015595 systemd[1]: Started session-114.scope. Feb 13 08:44:08.096639 sshd[8759]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:08.098520 systemd[1]: sshd@143-145.40.67.89:22-139.178.68.195:56494.service: Deactivated successfully. Feb 13 08:44:08.099012 systemd[1]: session-114.scope: Deactivated successfully. Feb 13 08:44:08.099381 systemd-logind[1463]: Session 114 logged out. Waiting for processes to exit. Feb 13 08:44:08.099815 systemd-logind[1463]: Removed session 114. 
Feb 13 08:44:08.010000 audit[8759]: CRED_ACQ pid=8759 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.193744 kernel: audit: type=1101 audit(1707813848.009:2375): pid=8759 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.193786 kernel: audit: type=1103 audit(1707813848.010:2376): pid=8759 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.193807 kernel: audit: type=1006 audit(1707813848.010:2377): pid=8759 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=114 res=1 Feb 13 08:44:08.010000 audit[8759]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe577ad120 a2=3 a3=0 items=0 ppid=1 pid=8759 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:08.253994 kernel: audit: type=1300 audit(1707813848.010:2377): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe577ad120 a2=3 a3=0 items=0 ppid=1 pid=8759 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:08.010000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:08.376614 kernel: audit: type=1327 audit(1707813848.010:2377): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:08.376686 kernel: audit: type=1105 audit(1707813848.017:2378): pid=8759 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.017000 audit[8759]: USER_START pid=8759 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.471842 kernel: audit: type=1103 audit(1707813848.017:2379): pid=8761 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.017000 audit[8761]: CRED_ACQ pid=8761 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.096000 audit[8759]: USER_END pid=8759 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.657965 kernel: audit: type=1106 audit(1707813848.096:2380): pid=8759 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:44:08.658041 kernel: audit: type=1104 audit(1707813848.096:2381): pid=8759 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.096000 audit[8759]: CRED_DISP pid=8759 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:08.097000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@143-145.40.67.89:22-139.178.68.195:56494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:09.266082 env[1475]: time="2024-02-13T08:44:09.265966718Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:44:09.316115 env[1475]: time="2024-02-13T08:44:09.316049234Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:09.316387 kubelet[2614]: E0213 08:44:09.316332 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:44:09.316387 kubelet[2614]: E0213 08:44:09.316382 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:44:09.316849 kubelet[2614]: E0213 08:44:09.316433 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:09.316849 kubelet[2614]: E0213 08:44:09.316472 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:44:13.106137 systemd[1]: Started sshd@144-145.40.67.89:22-139.178.68.195:56498.service. Feb 13 08:44:13.105000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@144-145.40.67.89:22-139.178.68.195:56498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:13.133599 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:44:13.133673 kernel: audit: type=1130 audit(1707813853.105:2383): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@144-145.40.67.89:22-139.178.68.195:56498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:13.252160 sshd[8815]: Accepted publickey for core from 139.178.68.195 port 56498 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:13.251000 audit[8815]: USER_ACCT pid=8815 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.252869 sshd[8815]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:13.254898 systemd-logind[1463]: New session 115 of user core. Feb 13 08:44:13.255383 systemd[1]: Started session-115.scope. 
Feb 13 08:44:13.251000 audit[8815]: CRED_ACQ pid=8815 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.434446 kernel: audit: type=1101 audit(1707813853.251:2384): pid=8815 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.434486 kernel: audit: type=1103 audit(1707813853.251:2385): pid=8815 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.434506 kernel: audit: type=1006 audit(1707813853.251:2386): pid=8815 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=115 res=1 Feb 13 08:44:13.493543 kernel: audit: type=1300 audit(1707813853.251:2386): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde9903960 a2=3 a3=0 items=0 ppid=1 pid=8815 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:13.251000 audit[8815]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffde9903960 a2=3 a3=0 items=0 ppid=1 pid=8815 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:13.586200 kernel: audit: type=1327 audit(1707813853.251:2386): proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:13.251000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:13.586395 sshd[8815]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:13.587785 systemd[1]: sshd@144-145.40.67.89:22-139.178.68.195:56498.service: Deactivated successfully. Feb 13 08:44:13.588207 systemd[1]: session-115.scope: Deactivated successfully. Feb 13 08:44:13.588586 systemd-logind[1463]: Session 115 logged out. Waiting for processes to exit. Feb 13 08:44:13.588990 systemd-logind[1463]: Removed session 115. Feb 13 08:44:13.616921 kernel: audit: type=1105 audit(1707813853.256:2387): pid=8815 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.256000 audit[8815]: USER_START pid=8815 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.712075 kernel: audit: type=1103 audit(1707813853.256:2388): pid=8817 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.256000 audit[8817]: CRED_ACQ pid=8817 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.801448 kernel: audit: type=1106 audit(1707813853.586:2389): pid=8815 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.586000 audit[8815]: USER_END pid=8815 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.897113 kernel: audit: type=1104 audit(1707813853.586:2390): pid=8815 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.586000 audit[8815]: CRED_DISP pid=8815 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:13.586000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@144-145.40.67.89:22-139.178.68.195:56498 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:16.264988 env[1475]: time="2024-02-13T08:44:16.264914260Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:44:16.278993 env[1475]: time="2024-02-13T08:44:16.278908076Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:16.279114 kubelet[2614]: E0213 08:44:16.279100 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:44:16.279331 kubelet[2614]: E0213 08:44:16.279130 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:44:16.279331 kubelet[2614]: E0213 08:44:16.279159 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:44:16.279331 kubelet[2614]: E0213 08:44:16.279182 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:44:18.265370 env[1475]: time="2024-02-13T08:44:18.265312554Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:44:18.279295 env[1475]: time="2024-02-13T08:44:18.279230125Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:18.279402 kubelet[2614]: E0213 08:44:18.279377 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:44:18.279589 kubelet[2614]: E0213 08:44:18.279404 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd 
ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:44:18.279589 kubelet[2614]: E0213 08:44:18.279430 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:18.279589 kubelet[2614]: E0213 08:44:18.279450 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:44:18.536233 systemd[1]: Started sshd@145-145.40.67.89:22-139.178.68.195:37362.service. Feb 13 08:44:18.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@145-145.40.67.89:22-139.178.68.195:37362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:18.563294 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:44:18.563372 kernel: audit: type=1130 audit(1707813858.535:2392): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@145-145.40.67.89:22-139.178.68.195:37362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:18.679000 audit[8904]: USER_ACCT pid=8904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.680503 sshd[8904]: Accepted publickey for core from 139.178.68.195 port 37362 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:18.682226 sshd[8904]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:18.684613 systemd-logind[1463]: New session 116 of user core. Feb 13 08:44:18.685039 systemd[1]: Started session-116.scope. Feb 13 08:44:18.762146 sshd[8904]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:18.763571 systemd[1]: sshd@145-145.40.67.89:22-139.178.68.195:37362.service: Deactivated successfully. Feb 13 08:44:18.763989 systemd[1]: session-116.scope: Deactivated successfully. Feb 13 08:44:18.764412 systemd-logind[1463]: Session 116 logged out. Waiting for processes to exit. Feb 13 08:44:18.764865 systemd-logind[1463]: Removed session 116. 
Feb 13 08:44:18.681000 audit[8904]: CRED_ACQ pid=8904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.862509 kernel: audit: type=1101 audit(1707813858.679:2393): pid=8904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.862540 kernel: audit: type=1103 audit(1707813858.681:2394): pid=8904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.862560 kernel: audit: type=1006 audit(1707813858.681:2395): pid=8904 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=116 res=1 Feb 13 08:44:18.921238 kernel: audit: type=1300 audit(1707813858.681:2395): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3fc38d40 a2=3 a3=0 items=0 ppid=1 pid=8904 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:18.681000 audit[8904]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3fc38d40 a2=3 a3=0 items=0 ppid=1 pid=8904 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:19.013503 kernel: audit: type=1327 audit(1707813858.681:2395): proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:18.681000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:19.044084 kernel: audit: type=1105 audit(1707813858.686:2396): pid=8904 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.686000 audit[8904]: USER_START pid=8904 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:19.138726 kernel: audit: type=1103 audit(1707813858.686:2397): pid=8906 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.686000 audit[8906]: CRED_ACQ pid=8906 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:19.228186 kernel: audit: type=1106 audit(1707813858.761:2398): pid=8904 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.761000 audit[8904]: USER_END pid=8904 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:44:19.323868 kernel: audit: type=1104 audit(1707813858.761:2399): pid=8904 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.761000 audit[8904]: CRED_DISP pid=8904 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:18.762000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@145-145.40.67.89:22-139.178.68.195:37362 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:20.264480 env[1475]: time="2024-02-13T08:44:20.264448015Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:44:20.279450 env[1475]: time="2024-02-13T08:44:20.279372775Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:20.279561 kubelet[2614]: E0213 08:44:20.279529 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:44:20.279561 kubelet[2614]: E0213 08:44:20.279560 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:44:20.279789 kubelet[2614]: E0213 08:44:20.279588 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:20.279789 kubelet[2614]: E0213 08:44:20.279612 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:44:23.772063 systemd[1]: Started sshd@146-145.40.67.89:22-139.178.68.195:37368.service. Feb 13 08:44:23.771000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@146-145.40.67.89:22-139.178.68.195:37368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:23.799085 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:44:23.799133 kernel: audit: type=1130 audit(1707813863.771:2401): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@146-145.40.67.89:22-139.178.68.195:37368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:23.916000 audit[8958]: USER_ACCT pid=8958 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:23.917443 sshd[8958]: Accepted publickey for core from 139.178.68.195 port 37368 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:23.918529 sshd[8958]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:23.920983 systemd-logind[1463]: New session 117 of user core. Feb 13 08:44:23.921483 systemd[1]: Started session-117.scope. Feb 13 08:44:23.999267 sshd[8958]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:24.000685 systemd[1]: sshd@146-145.40.67.89:22-139.178.68.195:37368.service: Deactivated successfully. Feb 13 08:44:24.001112 systemd[1]: session-117.scope: Deactivated successfully. Feb 13 08:44:24.001538 systemd-logind[1463]: Session 117 logged out. Waiting for processes to exit. Feb 13 08:44:24.002010 systemd-logind[1463]: Removed session 117. 
Feb 13 08:44:23.917000 audit[8958]: CRED_ACQ pid=8958 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:24.010996 kernel: audit: type=1101 audit(1707813863.916:2402): pid=8958 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:24.011029 kernel: audit: type=1103 audit(1707813863.917:2403): pid=8958 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:24.160011 kernel: audit: type=1006 audit(1707813863.917:2404): pid=8958 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=117 res=1 Feb 13 08:44:24.160048 kernel: audit: type=1300 audit(1707813863.917:2404): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff952f2fe0 a2=3 a3=0 items=0 ppid=1 pid=8958 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:23.917000 audit[8958]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff952f2fe0 a2=3 a3=0 items=0 ppid=1 pid=8958 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:23.917000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:24.264924 env[1475]: time="2024-02-13T08:44:24.264903983Z" level=info msg="StopPodSandbox for 
\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:44:24.276354 env[1475]: time="2024-02-13T08:44:24.276293539Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:24.276449 kubelet[2614]: E0213 08:44:24.276435 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:44:24.276601 kubelet[2614]: E0213 08:44:24.276460 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:44:24.276601 kubelet[2614]: E0213 08:44:24.276482 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:24.276601 kubelet[2614]: E0213 08:44:24.276500 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to 
\"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:44:24.282758 kernel: audit: type=1327 audit(1707813863.917:2404): proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:24.282793 kernel: audit: type=1105 audit(1707813863.922:2405): pid=8958 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:23.922000 audit[8958]: USER_START pid=8958 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:24.377434 kernel: audit: type=1103 audit(1707813863.923:2406): pid=8960 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:23.923000 audit[8960]: CRED_ACQ pid=8960 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:24.466862 kernel: audit: type=1106 
audit(1707813863.999:2407): pid=8958 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:23.999000 audit[8958]: USER_END pid=8958 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:23.999000 audit[8958]: CRED_DISP pid=8958 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:24.652107 kernel: audit: type=1104 audit(1707813863.999:2408): pid=8958 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:23.999000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@146-145.40.67.89:22-139.178.68.195:37368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:28.265435 env[1475]: time="2024-02-13T08:44:28.265384465Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:44:28.278938 env[1475]: time="2024-02-13T08:44:28.278893388Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:28.279111 kubelet[2614]: E0213 08:44:28.279097 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:44:28.279311 kubelet[2614]: E0213 08:44:28.279129 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:44:28.279311 kubelet[2614]: E0213 08:44:28.279158 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 
08:44:28.279311 kubelet[2614]: E0213 08:44:28.279181 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:44:29.008855 systemd[1]: Started sshd@147-145.40.67.89:22-139.178.68.195:48340.service. Feb 13 08:44:29.008000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@147-145.40.67.89:22-139.178.68.195:48340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:29.036176 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:44:29.036274 kernel: audit: type=1130 audit(1707813869.008:2410): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@147-145.40.67.89:22-139.178.68.195:48340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:29.154000 audit[9041]: USER_ACCT pid=9041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.155108 sshd[9041]: Accepted publickey for core from 139.178.68.195 port 48340 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:29.158234 sshd[9041]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:29.160627 systemd-logind[1463]: New session 118 of user core. Feb 13 08:44:29.161173 systemd[1]: Started session-118.scope. Feb 13 08:44:29.239559 sshd[9041]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:29.241016 systemd[1]: sshd@147-145.40.67.89:22-139.178.68.195:48340.service: Deactivated successfully. Feb 13 08:44:29.241426 systemd[1]: session-118.scope: Deactivated successfully. Feb 13 08:44:29.241757 systemd-logind[1463]: Session 118 logged out. Waiting for processes to exit. Feb 13 08:44:29.242301 systemd-logind[1463]: Removed session 118. 
Feb 13 08:44:29.157000 audit[9041]: CRED_ACQ pid=9041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.337466 kernel: audit: type=1101 audit(1707813869.154:2411): pid=9041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.337509 kernel: audit: type=1103 audit(1707813869.157:2412): pid=9041 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.337526 kernel: audit: type=1006 audit(1707813869.157:2413): pid=9041 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=118 res=1 Feb 13 08:44:29.396203 kernel: audit: type=1300 audit(1707813869.157:2413): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc82853090 a2=3 a3=0 items=0 ppid=1 pid=9041 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:29.157000 audit[9041]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc82853090 a2=3 a3=0 items=0 ppid=1 pid=9041 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:29.488394 kernel: audit: type=1327 audit(1707813869.157:2413): proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:29.157000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:29.518912 kernel: audit: type=1105 audit(1707813869.162:2414): pid=9041 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.162000 audit[9041]: USER_START pid=9041 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.613526 kernel: audit: type=1103 audit(1707813869.163:2415): pid=9043 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.163000 audit[9043]: CRED_ACQ pid=9043 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.239000 audit[9041]: USER_END pid=9041 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.798610 kernel: audit: type=1106 audit(1707813869.239:2416): pid=9041 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:44:29.798648 kernel: audit: type=1104 audit(1707813869.239:2417): pid=9041 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.239000 audit[9041]: CRED_DISP pid=9041 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:29.240000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@147-145.40.67.89:22-139.178.68.195:48340 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:32.266379 env[1475]: time="2024-02-13T08:44:32.266260791Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:44:32.292913 env[1475]: time="2024-02-13T08:44:32.292856863Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:32.293095 kubelet[2614]: E0213 08:44:32.293050 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:44:32.293095 kubelet[2614]: E0213 08:44:32.293073 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:44:32.293095 kubelet[2614]: E0213 08:44:32.293093 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:32.293321 kubelet[2614]: E0213 08:44:32.293109 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 08:44:34.251112 systemd[1]: Started sshd@148-145.40.67.89:22-139.178.68.195:48348.service. Feb 13 08:44:34.250000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@148-145.40.67.89:22-139.178.68.195:48348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Feb 13 08:44:34.278605 kernel: kauditd_printk_skb: 1 callbacks suppressed Feb 13 08:44:34.278673 kernel: audit: type=1130 audit(1707813874.250:2419): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@148-145.40.67.89:22-139.178.68.195:48348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:34.394000 audit[9094]: USER_ACCT pid=9094 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.395385 sshd[9094]: Accepted publickey for core from 139.178.68.195 port 48348 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:34.397253 sshd[9094]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:34.399640 systemd-logind[1463]: New session 119 of user core. Feb 13 08:44:34.400156 systemd[1]: Started session-119.scope. Feb 13 08:44:34.476882 sshd[9094]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:34.478347 systemd[1]: sshd@148-145.40.67.89:22-139.178.68.195:48348.service: Deactivated successfully. Feb 13 08:44:34.478764 systemd[1]: session-119.scope: Deactivated successfully. Feb 13 08:44:34.479125 systemd-logind[1463]: Session 119 logged out. Waiting for processes to exit. Feb 13 08:44:34.479636 systemd-logind[1463]: Removed session 119. 
Feb 13 08:44:34.396000 audit[9094]: CRED_ACQ pid=9094 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.577413 kernel: audit: type=1101 audit(1707813874.394:2420): pid=9094 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.577451 kernel: audit: type=1103 audit(1707813874.396:2421): pid=9094 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.577468 kernel: audit: type=1006 audit(1707813874.396:2422): pid=9094 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=119 res=1 Feb 13 08:44:34.636135 kernel: audit: type=1300 audit(1707813874.396:2422): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffea0a752f0 a2=3 a3=0 items=0 ppid=1 pid=9094 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:34.396000 audit[9094]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffea0a752f0 a2=3 a3=0 items=0 ppid=1 pid=9094 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:34.396000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:34.758914 kernel: audit: type=1327 audit(1707813874.396:2422): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:34.758949 kernel: audit: type=1105 audit(1707813874.401:2423): pid=9094 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.401000 audit[9094]: USER_START pid=9094 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.401000 audit[9096]: CRED_ACQ pid=9096 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.943010 kernel: audit: type=1103 audit(1707813874.401:2424): pid=9096 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.943055 kernel: audit: type=1106 audit(1707813874.476:2425): pid=9094 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.476000 audit[9094]: USER_END pid=9094 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:44:35.038849 kernel: audit: type=1104 audit(1707813874.476:2426): pid=9094 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.476000 audit[9094]: CRED_DISP pid=9094 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:34.477000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@148-145.40.67.89:22-139.178.68.195:48348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:35.265534 env[1475]: time="2024-02-13T08:44:35.265319405Z" level=info msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\"" Feb 13 08:44:35.313763 env[1475]: time="2024-02-13T08:44:35.313675425Z" level=error msg="StopPodSandbox for \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\" failed" error="failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:35.314033 kubelet[2614]: E0213 08:44:35.313974 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4" Feb 13 08:44:35.314033 kubelet[2614]: E0213 08:44:35.314024 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4} Feb 13 08:44:35.314468 kubelet[2614]: E0213 08:44:35.314079 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:35.314468 kubelet[2614]: E0213 08:44:35.314120 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59c850d0-0181-4283-967e-a70a4b1b7e64\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3f6419eac067254cb1a0da846aff37435588ab18328664c44e4b6afa3f6e1fb4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b75d6c9c-m8j85" podUID=59c850d0-0181-4283-967e-a70a4b1b7e64 Feb 13 08:44:35.613000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.613000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.613000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c000aebd40 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:44:35.613000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c001103ec0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:44:35.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:44:35.613000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:44:35.897000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.897000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c00a197e00 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:44:35.897000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:44:35.897000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-ca.crt" dev="sdb9" ino=525077 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.897000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c00b6828d0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:44:35.897000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:44:35.898000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.898000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c003a41d00 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" 
subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:44:35.898000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:44:35.898000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/front-proxy-client.crt" dev="sdb9" ino=525079 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.898000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=66 a1=c0114886c0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:44:35.898000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:44:35.898000 audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/apiserver.crt" dev="sdb9" ino=525073 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.898000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=67 a1=c0114313b0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:44:35.898000 
audit[2443]: AVC avc: denied { watch } for pid=2443 comm="kube-apiserver" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:35.898000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:44:35.898000 audit[2443]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=68 a1=c00323c8a0 a2=fc6 a3=0 items=0 ppid=2272 pid=2443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-apiserver" exe="/usr/local/bin/kube-apiserver" subj=system_u:system_r:svirt_lxc_net_t:s0:c440,c750 key=(null) Feb 13 08:44:35.898000 audit: PROCTITLE proctitle=6B7562652D617069736572766572002D2D6164766572746973652D616464726573733D3134352E34302E36372E3839002D2D616C6C6F772D70726976696C656765643D74727565002D2D617574686F72697A6174696F6E2D6D6F64653D4E6F64652C52424143002D2D636C69656E742D63612D66696C653D2F6574632F6B7562 Feb 13 08:44:38.265217 env[1475]: time="2024-02-13T08:44:38.265183548Z" level=info msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\"" Feb 13 08:44:38.282018 env[1475]: time="2024-02-13T08:44:38.281987525Z" level=error msg="StopPodSandbox for \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\" failed" error="failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:38.282130 kubelet[2614]: E0213 08:44:38.282121 2614 remote_runtime.go:205] "StopPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a" Feb 13 08:44:38.282281 kubelet[2614]: E0213 08:44:38.282144 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a} Feb 13 08:44:38.282281 kubelet[2614]: E0213 08:44:38.282168 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:38.282281 kubelet[2614]: E0213 08:44:38.282185 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1dbcd21d-393a-43bf-9e25-9f59bc66daab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"616f85ba9a832d7119abccf4b7cee2d45e57d6ff43154d99982663d3241c567a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-2shjr" podUID=1dbcd21d-393a-43bf-9e25-9f59bc66daab Feb 13 08:44:39.488337 systemd[1]: Started sshd@149-145.40.67.89:22-139.178.68.195:44998.service. 
Feb 13 08:44:39.487000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@149-145.40.67.89:22-139.178.68.195:44998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:39.516030 kernel: kauditd_printk_skb: 25 callbacks suppressed Feb 13 08:44:39.516117 kernel: audit: type=1130 audit(1707813879.487:2436): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@149-145.40.67.89:22-139.178.68.195:44998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:39.634000 audit[9180]: USER_ACCT pid=9180 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.635103 sshd[9180]: Accepted publickey for core from 139.178.68.195 port 44998 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:39.636176 sshd[9180]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:39.638460 systemd-logind[1463]: New session 120 of user core. Feb 13 08:44:39.638887 systemd[1]: Started session-120.scope. Feb 13 08:44:39.719764 sshd[9180]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:39.721093 systemd[1]: sshd@149-145.40.67.89:22-139.178.68.195:44998.service: Deactivated successfully. Feb 13 08:44:39.721494 systemd[1]: session-120.scope: Deactivated successfully. Feb 13 08:44:39.721856 systemd-logind[1463]: Session 120 logged out. Waiting for processes to exit. Feb 13 08:44:39.722705 systemd-logind[1463]: Removed session 120. 
Feb 13 08:44:39.635000 audit[9180]: CRED_ACQ pid=9180 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.817200 kernel: audit: type=1101 audit(1707813879.634:2437): pid=9180 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.817282 kernel: audit: type=1103 audit(1707813879.635:2438): pid=9180 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.817301 kernel: audit: type=1006 audit(1707813879.635:2439): pid=9180 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=120 res=1 Feb 13 08:44:39.876190 kernel: audit: type=1300 audit(1707813879.635:2439): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc760a940 a2=3 a3=0 items=0 ppid=1 pid=9180 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:39.635000 audit[9180]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc760a940 a2=3 a3=0 items=0 ppid=1 pid=9180 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:39.635000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:39.999093 kernel: audit: type=1327 audit(1707813879.635:2439): proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:39.999124 kernel: audit: type=1105 audit(1707813879.640:2440): pid=9180 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.640000 audit[9180]: USER_START pid=9180 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:40.093788 kernel: audit: type=1103 audit(1707813879.641:2441): pid=9182 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.641000 audit[9182]: CRED_ACQ pid=9182 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:40.183242 kernel: audit: type=1106 audit(1707813879.719:2442): pid=9180 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.719000 audit[9180]: USER_END pid=9180 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:44:39.719000 audit[9180]: CRED_DISP pid=9180 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:40.368474 kernel: audit: type=1104 audit(1707813879.719:2443): pid=9180 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:39.720000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@149-145.40.67.89:22-139.178.68.195:44998 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:40.955000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:40.955000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c002399880 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:44:40.955000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:44:40.956000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 
scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:40.956000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=b a1=c000bfc160 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:44:40.956000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:44:40.956000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:40.956000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=c a1=c001f62900 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:44:40.956000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:44:40.956000 audit[2442]: AVC avc: denied { watch } for pid=2442 comm="kube-controller" path="/etc/kubernetes/pki/ca.crt" dev="sdb9" ino=524806 scontext=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 
tcontext=system_u:object_r:etc_t:s0 tclass=file permissive=0 Feb 13 08:44:40.956000 audit[2442]: SYSCALL arch=c000003e syscall=254 success=no exit=-13 a0=d a1=c0023555e0 a2=fc6 a3=0 items=0 ppid=2294 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kube-controller" exe="/usr/local/bin/kube-controller-manager" subj=system_u:system_r:svirt_lxc_net_t:s0:c694,c824 key=(null) Feb 13 08:44:40.956000 audit: PROCTITLE proctitle=6B7562652D636F6E74726F6C6C65722D6D616E61676572002D2D616C6C6F636174652D6E6F64652D63696472733D74727565002D2D61757468656E7469636174696F6E2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F636F6E74726F6C6C65722D6D616E616765722E636F6E66002D2D617574686F7269 Feb 13 08:44:41.265531 env[1475]: time="2024-02-13T08:44:41.265446598Z" level=info msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\"" Feb 13 08:44:41.278693 env[1475]: time="2024-02-13T08:44:41.278652080Z" level=error msg="StopPodSandbox for \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\" failed" error="failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 08:44:41.278854 kubelet[2614]: E0213 08:44:41.278839 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9" Feb 13 08:44:41.279076 kubelet[2614]: E0213 
08:44:41.278870 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9} Feb 13 08:44:41.279076 kubelet[2614]: E0213 08:44:41.278898 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:41.279076 kubelet[2614]: E0213 08:44:41.278920 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"767512d8-ec8b-4a84-be29-2de84e2dbb6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"25fdaa4925a07b4a8fe24bbf745c522ab38928226e44b4825ec7862d641459e9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8wh4k" podUID=767512d8-ec8b-4a84-be29-2de84e2dbb6e Feb 13 08:44:44.264448 env[1475]: time="2024-02-13T08:44:44.264421856Z" level=info msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\"" Feb 13 08:44:44.277061 env[1475]: time="2024-02-13T08:44:44.276998817Z" level=error msg="StopPodSandbox for \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\" failed" error="failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Feb 13 08:44:44.277217 kubelet[2614]: E0213 08:44:44.277174 2614 remote_runtime.go:205] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37" Feb 13 08:44:44.277217 kubelet[2614]: E0213 08:44:44.277200 2614 kuberuntime_manager.go:965] "Failed to stop sandbox" podSandboxID={Type:containerd ID:e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37} Feb 13 08:44:44.277445 kubelet[2614]: E0213 08:44:44.277224 2614 kuberuntime_manager.go:705] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Feb 13 08:44:44.277445 kubelet[2614]: E0213 08:44:44.277244 2614 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8659a212-b7b0-442b-a180-caeaa9464f84\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e1b9e601e8c8def5463ae48ade6add6c6642447a7e672b1df539126d40d2ba37\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-787d4945fb-9lrbl" podUID=8659a212-b7b0-442b-a180-caeaa9464f84 Feb 13 
08:44:44.729395 systemd[1]: Started sshd@150-145.40.67.89:22-139.178.68.195:45010.service. Feb 13 08:44:44.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@150-145.40.67.89:22-139.178.68.195:45010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:44.756625 kernel: kauditd_printk_skb: 13 callbacks suppressed Feb 13 08:44:44.756679 kernel: audit: type=1130 audit(1707813884.728:2449): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@150-145.40.67.89:22-139.178.68.195:45010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Feb 13 08:44:44.874000 audit[9265]: USER_ACCT pid=9265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:44.875412 sshd[9265]: Accepted publickey for core from 139.178.68.195 port 45010 ssh2: RSA SHA256:1PlZIJIBJggYI3VbAvHiWZPn3uvIsILcfs6l5/y3kqg Feb 13 08:44:44.877213 sshd[9265]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Feb 13 08:44:44.879558 systemd-logind[1463]: New session 121 of user core. Feb 13 08:44:44.880070 systemd[1]: Started session-121.scope. Feb 13 08:44:44.959073 sshd[9265]: pam_unix(sshd:session): session closed for user core Feb 13 08:44:44.960547 systemd[1]: sshd@150-145.40.67.89:22-139.178.68.195:45010.service: Deactivated successfully. Feb 13 08:44:44.961030 systemd[1]: session-121.scope: Deactivated successfully. Feb 13 08:44:44.961427 systemd-logind[1463]: Session 121 logged out. Waiting for processes to exit. Feb 13 08:44:44.962149 systemd-logind[1463]: Removed session 121. 
Feb 13 08:44:44.876000 audit[9265]: CRED_ACQ pid=9265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:45.059565 kernel: audit: type=1101 audit(1707813884.874:2450): pid=9265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:45.059605 kernel: audit: type=1103 audit(1707813884.876:2451): pid=9265 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:45.059622 kernel: audit: type=1006 audit(1707813884.876:2452): pid=9265 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=121 res=1 Feb 13 08:44:45.118343 kernel: audit: type=1300 audit(1707813884.876:2452): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2c2c21a0 a2=3 a3=0 items=0 ppid=1 pid=9265 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=121 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:44.876000 audit[9265]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc2c2c21a0 a2=3 a3=0 items=0 ppid=1 pid=9265 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=121 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Feb 13 08:44:45.210548 kernel: audit: type=1327 audit(1707813884.876:2452): proctitle=737368643A20636F7265205B707269765D Feb 13 08:44:44.876000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Feb 13 
08:44:45.241063 kernel: audit: type=1105 audit(1707813884.881:2453): pid=9265 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:44.881000 audit[9265]: USER_START pid=9265 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:45.335763 kernel: audit: type=1103 audit(1707813884.882:2454): pid=9267 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:44.882000 audit[9267]: CRED_ACQ pid=9267 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:45.425185 kernel: audit: type=1106 audit(1707813884.958:2455): pid=9265 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:44.958000 audit[9265]: USER_END pid=9265 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh 
res=success' Feb 13 08:44:45.520899 kernel: audit: type=1104 audit(1707813884.958:2456): pid=9265 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:44.958000 audit[9265]: CRED_DISP pid=9265 uid=0 auid=500 ses=121 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Feb 13 08:44:44.959000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@150-145.40.67.89:22-139.178.68.195:45010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'